[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
12081 1726882379.94911: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12081 1726882379.95207: Added group all to inventory
12081 1726882379.95209: Added group ungrouped to inventory
12081 1726882379.95212: Group all now contains ungrouped
12081 1726882379.95214: Examining possible inventory source: /tmp/network-91m/inventory.yml
12081 1726882380.04510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12081 1726882380.04573: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12081 1726882380.04596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12081 1726882380.04657: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12081 1726882380.04733: Loaded config def from plugin (inventory/script)
12081 1726882380.04735: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12081 1726882380.04777: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12081 1726882380.04861: Loaded config def from plugin (inventory/yaml)
12081 1726882380.04865: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12081 1726882380.04937: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12081 1726882380.05250: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12081 1726882380.05253: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12081 1726882380.05255: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12081 1726882380.05260: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12081 1726882380.05263: Loading data from /tmp/network-91m/inventory.yml
12081 1726882380.05306: /tmp/network-91m/inventory.yml was not parsable by auto
12081 1726882380.05351: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12081 1726882380.05381: Loading data from /tmp/network-91m/inventory.yml
12081 1726882380.05435: group all already in inventory
12081 1726882380.05440: set inventory_file for managed_node1
12081 1726882380.05443: set inventory_dir for managed_node1
12081 1726882380.05443: Added host managed_node1 to inventory
12081 1726882380.05445: Added host managed_node1 to group all
12081 1726882380.05446: set ansible_host for managed_node1
12081 1726882380.05446: set ansible_ssh_extra_args for managed_node1
12081 1726882380.05448: set inventory_file for managed_node2
12081 1726882380.05450: set inventory_dir for managed_node2
12081 1726882380.05451: Added host managed_node2 to inventory
12081 1726882380.05452: Added host managed_node2 to group all
12081 1726882380.05453: set ansible_host for managed_node2
12081 1726882380.05453: set ansible_ssh_extra_args for managed_node2
12081 1726882380.05455: set inventory_file for managed_node3
12081 1726882380.05456: set inventory_dir for managed_node3
12081 1726882380.05457: Added host managed_node3 to inventory
12081 1726882380.05457: Added host managed_node3 to group all
12081 1726882380.05458: set ansible_host for managed_node3
12081 1726882380.05459: set ansible_ssh_extra_args for managed_node3
12081 1726882380.05461: Reconcile groups and hosts in inventory.
12081 1726882380.05465: Group ungrouped now contains managed_node1
12081 1726882380.05466: Group ungrouped now contains managed_node2
12081 1726882380.05467: Group ungrouped now contains managed_node3
12081 1726882380.05521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
12081 1726882380.05602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
12081 1726882380.05631: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
12081 1726882380.05649: Loaded config def from plugin (vars/host_group_vars)
12081 1726882380.05651: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
12081 1726882380.05657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
12081 1726882380.05662: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
12081 1726882380.05692: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
12081 1726882380.05917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882380.05984: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
12081 1726882380.06009: Loaded config def from plugin (connection/local)
12081 1726882380.06011: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
12081 1726882380.06360: Loaded config def from plugin (connection/paramiko_ssh)
12081 1726882380.06362: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
12081 1726882380.07112: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12081 1726882380.07143: Loaded config def from plugin (connection/psrp)
12081 1726882380.07145: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
12081 1726882380.07847: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12081 1726882380.07890: Loaded config def from plugin (connection/ssh)
12081 1726882380.07893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
12081 1726882380.09809: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12081 1726882380.09848: Loaded config def from plugin (connection/winrm)
12081 1726882380.09854: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
12081 1726882380.09887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
12081 1726882380.09951: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
12081 1726882380.10021: Loaded config def from plugin (shell/cmd)
12081 1726882380.10027: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
12081 1726882380.10071: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
12081 1726882380.10131: Loaded config def from plugin (shell/powershell)
12081 1726882380.10133: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
12081 1726882380.10187: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
12081 1726882380.10377: Loaded config def from plugin (shell/sh)
12081 1726882380.10379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
12081 1726882380.10412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
12081 1726882380.10675: Loaded config def from plugin (become/runas)
12081 1726882380.10677: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
12081 1726882380.10876: Loaded config def from plugin (become/su)
12081 1726882380.10879: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
12081 1726882380.11041: Loaded config def from plugin (become/sudo)
12081 1726882380.11043: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
12081 1726882380.11081: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
12081 1726882380.11412: in VariableManager get_vars()
12081 1726882380.11433: done with get_vars()
12081 1726882380.11568: trying /usr/local/lib/python3.12/site-packages/ansible/modules
12081 1726882380.16309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
12081 1726882380.16633: in VariableManager get_vars()
12081 1726882380.16638: done with get_vars()
12081 1726882380.16641: variable 'playbook_dir' from source: magic vars
12081 1726882380.16642: variable 'ansible_playbook_python' from source: magic vars
12081 1726882380.16643: variable 'ansible_config_file' from source: magic vars
12081 1726882380.16643: variable 'groups' from source: magic vars
12081 1726882380.16644: variable 'omit' from source: magic vars
12081 1726882380.16645: variable 'ansible_version' from source: magic vars
12081 1726882380.16646: variable 'ansible_check_mode' from source: magic vars
12081 1726882380.16647: variable 'ansible_diff_mode' from source: magic vars
12081 1726882380.16647: variable 'ansible_forks' from source: magic vars
12081 1726882380.16648: variable 'ansible_inventory_sources' from source: magic vars
12081 1726882380.16651: variable 'ansible_skip_tags' from source: magic vars
12081 1726882380.16652: variable 'ansible_limit' from source: magic vars
12081 1726882380.16653: variable 'ansible_run_tags' from source: magic vars
12081 1726882380.16654: variable 'ansible_verbosity' from source: magic vars
12081 1726882380.16693: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml
12081 1726882380.17414: in VariableManager get_vars()
12081 1726882380.17432: done with get_vars()
12081 1726882380.17579: in VariableManager get_vars()
12081 1726882380.17595: done with get_vars()
12081 1726882380.17653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
12081 1726882380.17669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
12081 1726882380.17892: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
12081 1726882380.18035: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
12081 1726882380.18038: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
12081 1726882380.18073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
12081 1726882380.18096: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
12081 1726882380.18251: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
12081 1726882380.18314: Loaded config def from plugin (callback/default)
12081 1726882380.18316: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12081 1726882380.19426: Loaded config def from plugin (callback/junit)
12081 1726882380.19429: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12081 1726882380.19479: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
12081 1726882380.19547: Loaded config def from plugin (callback/minimal)
12081 1726882380.19552: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12081 1726882380.19592: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12081 1726882380.19656: Loaded config def from plugin (callback/tree)
12081 1726882380.19659: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
12081 1726882380.19787: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
12081 1726882380.19790: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_options_nm.yml ********************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
12081 1726882380.19822: in VariableManager get_vars()
12081 1726882380.19837: done with get_vars()
12081 1726882380.19844: in VariableManager get_vars()
12081 1726882380.19856: done with get_vars()
12081 1726882380.19861: variable 'omit' from source: magic vars
12081 1726882380.19902: in VariableManager get_vars()
12081 1726882380.19915: done with get_vars()
12081 1726882380.19938: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] *****
12081 1726882380.20525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12081 1726882380.20626: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12081 1726882380.20665: getting the remaining hosts for this loop
12081 1726882380.20668: done getting the remaining hosts for this loop
12081 1726882380.20670: getting the next task for host managed_node3
12081 1726882380.20674: done getting next task for host managed_node3
12081 1726882380.20676: ^ task is: TASK: Gathering Facts
12081 1726882380.20677: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882380.20680: getting variables
12081 1726882380.20681: in VariableManager get_vars()
12081 1726882380.20692: Calling all_inventory to load vars for managed_node3
12081 1726882380.20695: Calling groups_inventory to load vars for managed_node3
12081 1726882380.20697: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882380.20710: Calling all_plugins_play to load vars for managed_node3
12081 1726882380.20723: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882380.20727: Calling groups_plugins_play to load vars for managed_node3
12081 1726882380.20769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882380.20822: done with get_vars()
12081 1726882380.20829: done getting variables
12081 1726882380.20895: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Friday 20 September 2024 21:33:00 -0400 (0:00:00.012) 0:00:00.012 ******
12081 1726882380.20915: entering _queue_task() for managed_node3/gather_facts
12081 1726882380.20917: Creating lock for gather_facts
12081 1726882380.21248: worker is 1 (out of 1 available)
12081 1726882380.21260: exiting _queue_task() for managed_node3/gather_facts
12081 1726882380.21273: done queuing things up, now waiting for results queue to drain
12081 1726882380.21274: waiting for pending results...
12081 1726882380.21505: running TaskExecutor() for managed_node3/TASK: Gathering Facts
12081 1726882380.21615: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000015
12081 1726882380.21636: variable 'ansible_search_path' from source: unknown
12081 1726882380.21679: calling self._execute()
12081 1726882380.21742: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882380.21755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882380.21770: variable 'omit' from source: magic vars
12081 1726882380.21871: variable 'omit' from source: magic vars
12081 1726882380.21901: variable 'omit' from source: magic vars
12081 1726882380.21937: variable 'omit' from source: magic vars
12081 1726882380.21992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12081 1726882380.22028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12081 1726882380.22055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12081 1726882380.22085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882380.22101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882380.22130: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12081 1726882380.22138: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882380.22144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882380.22248: Set connection var ansible_pipelining to False
12081 1726882380.22260: Set connection var ansible_shell_type to sh
12081 1726882380.22275: Set connection var ansible_shell_executable to /bin/sh
12081 1726882380.22284: Set connection var ansible_connection to ssh
12081 1726882380.22292: Set connection var ansible_timeout to 10
12081 1726882380.22300: Set connection var ansible_module_compression to ZIP_DEFLATED
12081 1726882380.22325: variable 'ansible_shell_executable' from source: unknown
12081 1726882380.22332: variable 'ansible_connection' from source: unknown
12081 1726882380.22338: variable 'ansible_module_compression' from source: unknown
12081 1726882380.22343: variable 'ansible_shell_type' from source: unknown
12081 1726882380.22352: variable 'ansible_shell_executable' from source: unknown
12081 1726882380.22359: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882380.22369: variable 'ansible_pipelining' from source: unknown
12081 1726882380.22377: variable 'ansible_timeout' from source: unknown
12081 1726882380.22385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882380.22569: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
12081 1726882380.22585: variable 'omit' from source: magic vars
12081 1726882380.22596: starting attempt loop
12081 1726882380.22604: running the handler
12081 1726882380.23315: variable 'ansible_facts' from source: unknown
12081 1726882380.23342: _low_level_execute_command(): starting
12081 1726882380.23359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12081 1726882380.25158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12081 1726882380.25180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.25196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.25214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.25261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882380.25278: stderr chunk (state=3): >>>debug2: match not found <<<
12081 1726882380.25293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.25311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12081 1726882380.25324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
12081 1726882380.25337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12081 1726882380.25354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.25372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.25392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.25404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882380.25416: stderr chunk (state=3): >>>debug2: match found <<<
12081 1726882380.25430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.25512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12081 1726882380.25536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12081 1726882380.25555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12081 1726882380.25930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12081 1726882380.27556: stdout chunk (state=3): >>>/root <<<
12081 1726882380.27669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12081 1726882380.27762: stderr chunk (state=3): >>><<<
12081 1726882380.27777: stdout chunk (state=3): >>><<<
12081 1726882380.27913: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
12081 1726882380.27916: _low_level_execute_command(): starting
12081 1726882380.27919: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325 `" && echo ansible-tmp-1726882380.2781296-12097-77882765978325="` echo /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325 `" ) && sleep 0'
12081 1726882380.28556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12081 1726882380.28572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.28593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.28619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.28668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882380.28682: stderr chunk (state=3): >>>debug2: match not found <<<
12081 1726882380.28702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.28720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12081 1726882380.28732: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
12081 1726882380.28743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12081 1726882380.28759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.28776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.28794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.28811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882380.28823: stderr chunk (state=3): >>>debug2: match found <<<
12081 1726882380.28837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.28923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12081 1726882380.28946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12081 1726882380.28969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12081 1726882380.29110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12081 1726882380.31076: stdout chunk (state=3): >>>ansible-tmp-1726882380.2781296-12097-77882765978325=/root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325 <<<
12081 1726882380.31186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12081 1726882380.31278: stderr chunk (state=3): >>><<<
12081 1726882380.31287: stdout chunk (state=3): >>><<<
12081 1726882380.31473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882380.2781296-12097-77882765978325=/root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
12081 1726882380.31476: variable 'ansible_module_compression' from source: unknown
12081 1726882380.31479: ANSIBALLZ: Using generic lock for ansible.legacy.setup
12081 1726882380.31481: ANSIBALLZ: Acquiring lock
12081 1726882380.31483: ANSIBALLZ: Lock acquired: 139893497835168
12081 1726882380.31485: ANSIBALLZ: Creating module
12081 1726882380.62007: ANSIBALLZ: Writing module into payload
12081 1726882380.62111: ANSIBALLZ: Writing module
12081 1726882380.62134: ANSIBALLZ: Renaming module
12081 1726882380.62137: ANSIBALLZ: Done creating module
12081 1726882380.62155: variable 'ansible_facts' from source: unknown
12081 1726882380.62161: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12081 1726882380.62171: _low_level_execute_command(): starting
12081 1726882380.62180: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
12081 1726882380.62609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.62632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found <<<
12081 1726882380.62643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.62696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12081 1726882380.62704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12081 1726882380.62712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12081 1726882380.62837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12081 1726882380.64478: stdout chunk (state=3): >>>PLATFORM <<<
12081 1726882380.64543: stdout chunk (state=3): >>>Linux <<<
12081 1726882380.64580: stdout chunk (state=3): >>>FOUND <<<
12081 1726882380.64583: stdout chunk (state=3): >>>/usr/bin/python3.9 <<<
12081 1726882380.64585: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
12081 1726882380.64719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12081 1726882380.64759: stderr chunk (state=3): >>><<<
12081 1726882380.64762: stdout chunk (state=3): >>><<<
12081 1726882380.64782: _low_level_execute_command() done: rc=0, stdout=PLATFORM
Linux
FOUND
/usr/bin/python3.9
/usr/bin/python3
/usr/bin/python3
ENDFOUND
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
12081 1726882380.64796 [managed_node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3']
12081 1726882380.64848: _low_level_execute_command(): starting
12081 1726882380.64861: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0'
12081 1726882380.64997: Sending initial data
12081 1726882380.65000: Sent initial data (1181 bytes)
12081 1726882380.65557: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12081 1726882380.65575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.65597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.65618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882380.65660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882380.65675: stderr chunk (state=3): >>>debug2: match not found <<<
12081 1726882380.65689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882380.65716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12081 1726882380.65729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
12081 1726882380.65740: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12081 1726882380.65755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882380.65772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882380.65787: stderr chunk (state=3): >>>debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882380.65799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882380.65816: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882380.65832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.65907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882380.65929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882380.65946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882380.66073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882380.69845: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 12081 1726882380.70213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882380.70273: stderr chunk (state=3): >>><<< 12081 1726882380.70276: stdout chunk (state=3): >>><<< 12081 1726882380.70287: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882380.70334: variable 'ansible_facts' from source: unknown 12081 1726882380.70337: variable 'ansible_facts' from source: unknown 12081 1726882380.70348: variable 'ansible_module_compression' from source: unknown 12081 1726882380.70385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12081 1726882380.70407: variable 'ansible_facts' from source: unknown 12081 1726882380.70499: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/AnsiballZ_setup.py 12081 1726882380.70605: Sending initial data 12081 1726882380.70609: Sent initial data (153 bytes) 12081 1726882380.71275: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882380.71281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882380.71323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.71327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882380.71330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.71393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882380.71396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882380.71402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882380.71500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882380.73280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 12081 1726882380.73286: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882380.73381: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 12081 1726882380.73388: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882380.73483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp37q0p6b2 /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/AnsiballZ_setup.py <<< 12081 1726882380.73583: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882380.75552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882380.75656: stderr chunk (state=3): >>><<< 12081 1726882380.75660: stdout chunk (state=3): >>><<< 12081 1726882380.75680: done transferring module to remote 12081 1726882380.75692: _low_level_execute_command(): starting 12081 1726882380.75697: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/ /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/AnsiballZ_setup.py && sleep 0' 12081 1726882380.76146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882380.76150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882380.76191: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882380.76197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882380.76204: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882380.76209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882380.76223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882380.76228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.76295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882380.76300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882380.76403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882380.78154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882380.78203: stderr chunk (state=3): >>><<< 12081 1726882380.78206: stdout chunk (state=3): >>><<< 12081 1726882380.78222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882380.78225: _low_level_execute_command(): starting 12081 1726882380.78230: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/AnsiballZ_setup.py && sleep 0' 12081 1726882380.78683: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882380.78688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882380.78724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.78729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882380.78737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882380.78742: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882380.78760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882380.78765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882380.78816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882380.78828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882380.78937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882380.80908: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12081 1726882380.80920: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12081 1726882380.80982: stdout chunk (state=3): >>>import '_io' # <<< 12081 1726882380.80987: stdout chunk (state=3): >>>import 'marshal' # <<< 12081 1726882380.81028: stdout chunk (state=3): >>>import 'posix' # <<< 12081 1726882380.81061: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12081 1726882380.81069: stdout chunk (state=3): >>># installing zipimport hook <<< 12081 1726882380.81089: stdout chunk (state=3): >>>import 'time' # <<< 12081 1726882380.81108: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12081 1726882380.81155: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 12081 1726882380.81159: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.81179: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches 
/usr/lib64/python3.9/codecs.py <<< 12081 1726882380.81199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 12081 1726882380.81205: stdout chunk (state=3): >>>import '_codecs' # <<< 12081 1726882380.81227: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98dc0> <<< 12081 1726882380.81266: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12081 1726882380.81284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 12081 1726882380.81289: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98b20> <<< 12081 1726882380.81314: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12081 1726882380.81331: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98ac0> <<< 12081 1726882380.81350: stdout chunk (state=3): >>>import '_signal' # <<< 12081 1726882380.81374: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12081 1726882380.81399: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d490> <<< 12081 1726882380.81416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 12081 
1726882380.81430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12081 1726882380.81433: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 12081 1726882380.81438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882380.81462: stdout chunk (state=3): >>>import '_abc' # <<< 12081 1726882380.81468: stdout chunk (state=3): >>> import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d940> <<< 12081 1726882380.81486: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d670> <<< 12081 1726882380.81517: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12081 1726882380.81530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12081 1726882380.81549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12081 1726882380.81575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12081 1726882380.81589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12081 1726882380.81614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12081 1726882380.81633: stdout chunk (state=3): >>>import '_stat' # <<< 12081 1726882380.81638: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf190> <<< 12081 1726882380.81655: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 12081 1726882380.81679: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12081 1726882380.81752: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf220> <<< 12081 1726882380.81782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12081 1726882380.81818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 12081 1726882380.81821: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf940> <<< 12081 1726882380.81849: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e55880> <<< 12081 1726882380.81871: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 12081 1726882380.81879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bc8d90> <<< 12081 1726882380.81935: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 12081 1726882380.81950: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bf2d90> <<< 12081 1726882380.82002: stdout chunk (state=3): >>>import 'site' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d970> <<< 12081 1726882380.82030: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) <<< 12081 1726882380.82034: stdout chunk (state=3): >>> [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12081 1726882380.82369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 12081 1726882380.82379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12081 1726882380.82404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 12081 1726882380.82410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12081 1726882380.82425: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12081 1726882380.82447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12081 1726882380.82468: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 12081 1726882380.82482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12081 1726882380.82491: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6feb0> <<< 12081 1726882380.82535: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b71f40> <<< 12081 1726882380.82554: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 12081 1726882380.82571: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12081 1726882380.82582: stdout chunk (state=3): >>>import '_sre' # <<< 12081 1726882380.82603: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 12081 1726882380.82619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12081 1726882380.82641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 12081 1726882380.82645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12081 1726882380.82672: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b67610> <<< 12081 1726882380.82684: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6b640> <<< 12081 1726882380.82697: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6f370> <<< 12081 1726882380.82714: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12081 1726882380.82814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 12081 1726882380.82838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.82862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 12081 1726882380.82905: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12081 1726882380.82909: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2a54d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54880> <<< 12081 1726882380.82924: stdout chunk (state=3): >>>import 'itertools' # <<< 12081 1726882380.82940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 12081 1726882380.82958: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54e80> <<< 12081 1726882380.82973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 12081 1726882380.82984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12081 1726882380.83008: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54f40> <<< 12081 1726882380.83081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 12081 1726882380.83091: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54e50> <<< 12081 1726882380.83105: stdout chunk (state=3): >>>import '_collections' # <<< 12081 1726882380.83116: stdout chunk (state=3): >>>import 'collections' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b49d00> import '_functools' # <<< 12081 1726882380.83138: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b425e0> <<< 12081 1726882380.83202: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b56640> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b75df0> <<< 12081 1726882380.83225: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 12081 1726882380.83259: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2a66c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b49220> <<< 12081 1726882380.83329: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.83332: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2b56250> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b7b9a0> <<< 12081 1726882380.83358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches 
/usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 12081 1726882380.83408: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.83422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 12081 1726882380.83435: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66d60> <<< 12081 1726882380.83449: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66cd0> <<< 12081 1726882380.83477: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 12081 1726882380.83507: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882380.83530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12081 1726882380.83582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 12081 
1726882380.83628: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a3a340> <<< 12081 1726882380.83640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12081 1726882380.83673: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a3a430> <<< 12081 1726882380.83871: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a6df70> <<< 12081 1726882380.83883: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a68a00> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a684c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12081 1726882380.83934: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 12081 1726882380.83938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 12081 1726882380.83965: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 12081 1726882380.83968: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f296d190> <<< 12081 1726882380.83997: 
stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a24cd0> <<< 12081 1726882380.84043: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a68e80> <<< 12081 1726882380.84048: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b75fd0> <<< 12081 1726882380.84087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 12081 1726882380.84120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 12081 1726882380.84128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 12081 1726882380.84130: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f297fac0> <<< 12081 1726882380.84137: stdout chunk (state=3): >>>import 'errno' # <<< 12081 1726882380.84170: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f297fdf0> <<< 12081 1726882380.84192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 12081 1726882380.84218: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 12081 1726882380.84232: stdout 
chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991700> <<< 12081 1726882380.84254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 12081 1726882380.84290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 12081 1726882380.84313: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991c40> <<< 12081 1726882380.84345: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.84357: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2929370> <<< 12081 1726882380.84362: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f297fee0> <<< 12081 1726882380.84384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 12081 1726882380.84389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 12081 1726882380.84427: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f293a250> <<< 12081 1726882380.84441: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991580> <<< 12081 1726882380.84446: stdout chunk (state=3): >>>import 'pwd' # <<< 12081 1726882380.84477: 
stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f293a310> <<< 12081 1726882380.84516: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a669a0> <<< 12081 1726882380.84531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 12081 1726882380.84554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 12081 1726882380.84573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 12081 1726882380.84586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 12081 1726882380.84621: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.84629: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955670> <<< 12081 1726882380.84644: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 12081 1726882380.84648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 12081 1726882380.84675: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.84682: stdout chunk (state=3): >>># extension module 
'_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2955730> <<< 12081 1726882380.84705: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955820> <<< 12081 1726882380.84732: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 12081 1726882380.84736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12081 1726882380.84933: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.84940: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955c70> <<< 12081 1726882380.84969: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.84975: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f29641c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f29558b0> <<< 12081 1726882380.84993: stdout chunk (state=3): >>>import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2949a00> <<< 12081 1726882380.85015: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66580> <<< 12081 1726882380.85041: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 12081 1726882380.85099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 12081 1726882380.85138: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2955a60> <<< 12081 1726882380.85284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12081 1726882380.85298: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb4f2892640> <<< 12081 1726882380.85559: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip' <<< 12081 1726882380.85565: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.85654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.85678: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 12081 1726882380.85685: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.85705: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.85718: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 12081 1726882380.85724: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.86931: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12081 1726882380.87843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 12081 1726882380.87917: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0790> <<< 12081 1726882380.87947: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.87950: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 12081 1726882380.87952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12081 1726882380.87954: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 12081 1726882380.87956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12081 1726882380.87958: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.87960: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d0160> <<< 12081 1726882380.87983: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0280> <<< 12081 1726882380.88036: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0ee0> <<< 12081 
1726882380.88040: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 12081 1726882380.88042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12081 1726882380.88087: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0d00> import 'atexit' # <<< 12081 1726882380.88115: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d0f40> <<< 12081 1726882380.88133: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12081 1726882380.88157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12081 1726882380.88213: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0100> <<< 12081 1726882380.88243: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12081 1726882380.88247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12081 1726882380.88249: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12081 1726882380.88271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12081 1726882380.88294: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12081 1726882380.88382: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27a5160> <<< 12081 1726882380.88440: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21870a0> <<< 12081 1726882380.88445: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2187280> <<< 12081 1726882380.88475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12081 1726882380.88527: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2187c10> <<< 12081 1726882380.88530: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b6dc0> <<< 12081 1726882380.88704: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b63d0> <<< 12081 1726882380.88720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12081 1726882380.88724: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b6f40> <<< 12081 1726882380.88751: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12081 1726882380.88787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12081 1726882380.88816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 12081 1726882380.88819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12081 1726882380.88858: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 12081 1726882380.88861: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2805b20> <<< 12081 1726882380.88938: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d7ca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d7370> <<< 12081 1726882380.88955: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2784bb0> <<< 12081 1726882380.88983: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' 
import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d7490> <<< 12081 1726882380.89005: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d74c0> <<< 12081 1726882380.89023: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12081 1726882380.89044: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 12081 1726882380.89083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12081 1726882380.89153: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e5220> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f28171c0> <<< 12081 1726882380.89173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 12081 1726882380.89192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12081 1726882380.89239: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension 
module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21f28e0> <<< 12081 1726882380.89251: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2817340> <<< 12081 1726882380.89267: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 12081 1726882380.89309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.89330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 12081 1726882380.89402: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2817ca0> <<< 12081 1726882380.89528: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f2880> <<< 12081 1726882380.89648: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27b0160> <<< 12081 1726882380.89652: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27db9a0> <<< 12081 1726882380.89684: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f28176d0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f280f880> <<< 12081 1726882380.89721: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 12081 1726882380.89752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 12081 1726882380.89756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12081 1726882380.89793: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e79d0> <<< 12081 1726882380.90061: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2726d00> <<< 12081 1726882380.90067: stdout chunk (state=3): >>>import 'socket' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f1640> <<< 12081 1726882380.90169: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f1a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 12081 1726882380.90172: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.90214: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.90247: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 12081 1726882380.90268: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 12081 1726882380.90279: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.90378: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.90469: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.90920: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.91378: stdout chunk (state=3): >>>import 
ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 12081 1726882380.91409: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 12081 1726882380.91424: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 12081 1726882380.91427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.91479: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f274f790> <<< 12081 1726882380.91545: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27547f0> <<< 12081 1726882380.91559: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d779d0> <<< 12081 1726882380.91614: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 12081 1726882380.91617: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.91640: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.91650: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 12081 1726882380.91773: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.91899: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12081 1726882380.91926: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f278e760> <<< 12081 1726882380.91936: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92321: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92694: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92742: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92812: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 12081 1726882380.92844: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92885: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 12081 1726882380.92889: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.92944: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93045: stdout chunk (state=3): 
>>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 12081 1726882380.93071: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 12081 1726882380.93094: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93133: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 12081 1726882380.93137: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93319: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93501: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 12081 1726882380.93540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 12081 1726882380.93615: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d3400> <<< 12081 1726882380.93618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93677: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93759: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 12081 1726882380.93765: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12081 1726882380.93795: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93813: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93860: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 12081 1726882380.93863: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93892: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.93926: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94104: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.94180: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2746a60> 
<<< 12081 1726882380.94267: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d6f70> <<< 12081 1726882380.94309: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 12081 1726882380.94354: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94422: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94435: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94492: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 12081 1726882380.94496: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 12081 1726882380.94507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12081 1726882380.94536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 12081 1726882380.94566: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 12081 1726882380.94580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12081 1726882380.94653: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2757640> <<< 12081 1726882380.94691: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27a2cd0> <<< 12081 1726882380.94752: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27467f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 12081 1726882380.94780: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94806: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94809: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 12081 1726882380.94893: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 12081 1726882380.94921: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 12081 1726882380.94924: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.94972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95125: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12081 1726882380.95138: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95197: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95204: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 12081 1726882380.95216: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95274: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95334: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95355: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95385: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 12081 1726882380.95388: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95534: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95675: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95703: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.95753: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882380.95780: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 12081 1726882380.95793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 12081 1726882380.95804: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches 
/usr/lib64/python3.9/multiprocessing/process.py <<< 12081 1726882380.95811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 12081 1726882380.95836: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d9b9d0> <<< 12081 1726882380.95862: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 12081 1726882380.95871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 12081 1726882380.95885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 12081 1726882380.95916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 12081 1726882380.95933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 12081 1726882380.95952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 12081 1726882380.95955: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d54b20> <<< 12081 1726882380.95995: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.96000: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1d54a90> <<< 12081 1726882380.96058: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb4f1d89820> <<< 12081 1726882380.96075: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d9bf70> <<< 12081 1726882380.96101: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1af5e20> <<< 12081 1726882380.96106: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d7a670> <<< 12081 1726882380.96125: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 12081 1726882380.96142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 12081 1726882380.96167: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 12081 1726882380.96172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 12081 1726882380.96205: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27b3c70> <<< 12081 1726882380.96220: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d42130> <<< 12081 1726882380.96231: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 12081 1726882380.96248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 12081 1726882380.96270: stdout chunk (state=3): >>>import 
'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b34f0> <<< 12081 1726882380.96292: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 12081 1726882380.96311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 12081 1726882380.96340: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882380.96351: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1b5df70> <<< 12081 1726882380.96375: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d86d30> <<< 12081 1726882380.96400: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d7a9a0> <<< 12081 1726882380.96409: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 12081 1726882380.96429: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12081 1726882380.96437: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 12081 1726882380.96454: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96506: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96561: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 12081 1726882380.96563: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96600: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96648: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 12081 1726882380.96654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96676: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 12081 1726882380.96688: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96708: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96742: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 12081 1726882380.96745: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96792: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96836: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 12081 1726882380.96842: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96876: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96915: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 12081 1726882380.96920: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.96982: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97024: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97102: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97136: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 12081 1726882380.97139: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97526: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97893: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 12081 1726882380.97933: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.97987: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12081 1726882380.98008: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98043: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 12081 1726882380.98050: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 12081 1726882380.98056: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98079: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98105: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 12081 1726882380.98115: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98163: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98211: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 12081 1726882380.98218: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98242: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98273: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 12081 1726882380.98279: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98308: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882380.98332: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 12081 1726882380.98335: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98404: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 12081 1726882380.98507: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1a75df0> <<< 12081 1726882380.98519: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 12081 1726882380.98545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 12081 1726882380.98701: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1a759d0> <<< 12081 1726882380.98705: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 12081 1726882380.98707: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98765: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98820: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 12081 1726882380.98829: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882380.98900: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.98982: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 12081 1726882380.98987: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99043: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99106: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 12081 1726882380.99118: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99147: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 12081 1726882380.99211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 12081 1726882380.99354: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1a68b50> <<< 12081 1726882380.99594: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1ab46a0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 12081 1726882380.99597: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882380.99640: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99696: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 12081 1726882380.99772: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99837: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882380.99937: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00084: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 12081 1726882381.00112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00160: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 12081 1726882381.00167: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00183: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 12081 1726882381.00279: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f19e2340> <<< 12081 1726882381.00300: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f19e2640> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 12081 1726882381.00319: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 12081 1726882381.00330: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00357: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00404: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 12081 1726882381.00410: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00540: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00666: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 12081 1726882381.00670: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00744: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00826: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882381.00859: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00904: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 12081 1726882381.00907: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 12081 1726882381.00910: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.00980: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01005: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01113: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01241: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 12081 1726882381.01245: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01356: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01447: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 12081 1726882381.01477: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01507: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.01939: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02349: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 12081 1726882381.02352: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 12081 1726882381.02359: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02443: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02532: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 12081 1726882381.02541: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02701: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 12081 1726882381.02706: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02832: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02979: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 12081 1726882381.02983: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12081 1726882381.02993: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.02995: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 12081 1726882381.03001: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03033: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03076: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 12081 1726882381.03083: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03163: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03244: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03415: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03596: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 12081 1726882381.03601: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03624: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03669: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 12081 1726882381.03723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03726: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 12081 1726882381.03729: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03782: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03849: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 12081 1726882381.03891: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03907: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 12081 1726882381.03910: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.03949: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04009: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 12081 1726882381.04012: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04056: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04115: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd 
# loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 12081 1726882381.04118: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04326: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04539: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 12081 1726882381.04545: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04605: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04641: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 12081 1726882381.04653: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04681: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04723: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 12081 1726882381.04754: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04784: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 12081 1726882381.04795: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04817: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04848: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 12081 1726882381.04866: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04945: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.04992: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 12081 1726882381.05013: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12081 1726882381.05027: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 12081 1726882381.05083: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05109: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 12081 1726882381.05125: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05141: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05160: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05193: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05299: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05365: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 12081 1726882381.05378: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05418: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05467: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 12081 1726882381.05479: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05638: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05792: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 12081 1726882381.05806: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05832: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05877: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 12081 1726882381.05887: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.05922: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 
1726882381.05972: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 12081 1726882381.05978: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.06039: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.06111: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 12081 1726882381.06115: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 12081 1726882381.06120: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.06193: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.06267: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 12081 1726882381.06272: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 12081 1726882381.06340: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.06510: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 12081 1726882381.06526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 12081 1726882381.06529: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 12081 1726882381.06532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 12081 1726882381.06562: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882381.06568: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f19eb3d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1b57130> <<< 12081 1726882381.06625: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1866430> <<< 12081 1726882381.07930: stdout chunk (state=3): >>>import 'gc' # <<< 12081 1726882381.14088: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 12081 1726882381.14113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 12081 1726882381.14123: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f19eb490> <<< 12081 1726882381.14141: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 
12081 1726882381.14168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 12081 1726882381.14191: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1851dc0> <<< 12081 1726882381.14240: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.14292: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f187c8e0> <<< 12081 1726882381.14299: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f187c430> <<< 12081 1726882381.14582: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 12081 1726882381.14590: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 12081 1726882381.34857: stdout chunk (state=3): >>> <<< 12081 1726882381.34891: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.52, "5m": 0.38, "15m": 0.17}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_R<<< 12081 1726882381.34911: stdout chunk (state=3): >>>EQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5r<<< 12081 1726882381.34932: stdout chunk (state=3): 
>>>cQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_r<<< 12081 1726882381.34949: stdout chunk (state=3): >>>eceive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checks<<< 12081 1726882381.34972: stdout chunk (state=3): >>>umming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "r<<< 12081 1726882381.34988: stdout chunk (state=3): >>>x_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2826, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 706, "free": 2826}, "nocache": {"free": 3272, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU<<< 12081 1726882381.35012: stdout chunk (state=3): >>>", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 323, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264249925632, "block_size": 4096, "block_total": 65519355, "block_available": 64514142, "block_used": 1005213, "inode_total": 131071472, "inode_available": 130998785, "inode_used": 72687, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", 
"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "01", "epoch": "1726882381", "epoch_int": "1726882381", "date": "2024-09-20", "time": "21:33:01", "iso8601_micro": "2024-09-21T01:33:01.345077Z", "iso8601": "2024-09-21T01:33:01Z", "iso8601_basic": "20240920T213301345077", "iso8601_basic_short": "20240920T213301", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansib<<< 12081 1726882381.35017: stdout chunk (state=3): >>>le_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 12081 1726882381.35023: stdout chunk (state=3): >>> <<< 12081 1726882381.35603: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 12081 1726882381.35621: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] 
removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq <<< 12081 1726882381.35632: stdout chunk (state=3): >>># cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch 
# cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 12081 1726882381.35635: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 12081 1726882381.35649: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 12081 1726882381.35672: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal 
# cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 12081 1726882381.35676: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 12081 1726882381.35684: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors <<< 12081 1726882381.35746: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # 
destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 12081 1726882381.35796: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # 
cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_uti<<< 12081 1726882381.35812: stdout chunk (state=3): >>>ls.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] 
removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy 
ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12081 1726882381.36081: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12081 1726882381.36102: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12081 1726882381.36138: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 12081 1726882381.36156: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 12081 1726882381.36173: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 12081 1726882381.36193: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 12081 1726882381.36198: stdout chunk (state=3): >>># destroy encodings <<< 12081 1726882381.36224: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12081 1726882381.36267: stdout chunk (state=3): >>># destroy selinux <<< 12081 1726882381.36272: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 12081 1726882381.36314: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 12081 1726882381.36318: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # 
destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 12081 1726882381.36345: stdout chunk (state=3): >>># destroy queue <<< 12081 1726882381.36359: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 12081 1726882381.36370: stdout chunk (state=3): >>># destroy shlex <<< 12081 1726882381.36382: stdout chunk (state=3): >>># destroy datetime <<< 12081 1726882381.36389: stdout chunk (state=3): >>># destroy base64 <<< 12081 1726882381.36409: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 12081 1726882381.36418: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 12081 1726882381.36470: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 12081 1726882381.36475: stdout chunk (state=3): >>># destroy glob <<< 12081 1726882381.36481: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 12081 1726882381.36482: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 12081 1726882381.36513: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep <<< 12081 1726882381.36535: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle <<< 12081 1726882381.36538: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 12081 1726882381.36567: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy 
configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 12081 1726882381.36592: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 12081 1726882381.36612: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 12081 1726882381.36631: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 12081 1726882381.36648: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 12081 1726882381.36651: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 12081 1726882381.36702: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 12081 1726882381.36718: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 12081 1726882381.36737: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 12081 1726882381.36740: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections 
# cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 12081 1726882381.36743: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 12081 1726882381.36745: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath <<< 12081 1726882381.36748: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 12081 1726882381.36750: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 12081 1726882381.36758: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 12081 1726882381.36761: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 12081 1726882381.36793: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios <<< 12081 1726882381.36797: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 12081 1726882381.36802: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 12081 1726882381.36972: stdout chunk (state=3): >>># destroy platform <<< 12081 1726882381.37000: stdout chunk (state=3): 
>>># destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 12081 1726882381.37006: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 12081 1726882381.37024: stdout chunk (state=3): >>># destroy stat <<< 12081 1726882381.37044: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 12081 1726882381.37048: stdout chunk (state=3): >>># destroy select <<< 12081 1726882381.37061: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 12081 1726882381.37066: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 12081 1726882381.37112: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 12081 1726882381.37453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882381.37510: stderr chunk (state=3): >>><<< 12081 1726882381.37513: stdout chunk (state=3): >>><<< 12081 1726882381.37623: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bcf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2bf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2e3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6feb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6b640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b6f370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2a54d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54880> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54e80> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54f40> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a54e50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b49d00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b425e0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b56640> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b75df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2a66c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b49220> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2b56250> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b7b9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66d60> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66cd0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a3a340> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a3a430> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a6df70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a68a00> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a684c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f296d190> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a24cd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a68e80> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2b75fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f297fac0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f297fdf0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991700> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991c40> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2929370> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f297fee0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f293a250> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2991580> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f293a310> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a669a0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955670> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2955730> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955820> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2955c70> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f29641c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f29558b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2949a00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2a66580> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2955a60> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb4f2892640> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0790> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d0160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0ee0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0d00> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d0f40> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d0100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27a5160> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21870a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2187280> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2187c10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b6dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b63d0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b6f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2805b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d7ca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d7370> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2784bb0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27d7490> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d74c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e5220> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f28171c0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21f28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2817340> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2817ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f2880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27b0160> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27db9a0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f28176d0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f280f880> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e79d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2726d00> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f1640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f21e7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f21f1a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f274f790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27547f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d779d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f278e760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d3400> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f2746a60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27d6f70> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f2757640> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27a2cd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27467f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d9b9d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d54b20> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1d54a90> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d89820> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d9bf70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1af5e20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d7a670> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f27b3c70> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d42130> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f27b34f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1b5df70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d86d30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1d7a9a0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1a75df0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1a759d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f1a68b50> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1ab46a0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f19e2340> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f19e2640> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_btbdl4em/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb4f19eb3d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1b57130> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1866430> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f19eb490> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f1851dc0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f187c8e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb4f187c430> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", 
"ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.52, "5m": 0.38, "15m": 0.17}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": 
true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": 
"off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2826, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 706, "free": 2826}, "nocache": {"free": 3272, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 323, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264249925632, "block_size": 4096, "block_total": 65519355, "block_available": 64514142, "block_used": 1005213, "inode_total": 131071472, "inode_available": 130998785, "inode_used": 72687, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": 
"5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "01", "epoch": "1726882381", "epoch_int": "1726882381", "date": "2024-09-20", "time": "21:33:01", "iso8601_micro": "2024-09-21T01:33:01.345077Z", "iso8601": "2024-09-21T01:33:01Z", "iso8601_basic": "20240920T213301345077", "iso8601_basic_short": "20240920T213301", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] 
removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] 
removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # 
cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] 
removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # 
destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # 
destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] 
wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # 
cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
[WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
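The interpreter-discovery warning above can be silenced by pinning the interpreter path explicitly instead of relying on discovery. A minimal sketch (hypothetical inventory snippet, not taken from this run's inventory file):

```yaml
# Hypothetical YAML inventory snippet: pin the interpreter that discovery
# found, so the meaning of the path cannot change if another Python is
# installed on the managed host later.
all:
  hosts:
    managed_node3:
      ansible_python_interpreter: /usr/bin/python3.9
```

Alternatively, `interpreter_python` can be set under `[defaults]` in `ansible.cfg`; the documented value `auto_silent` keeps discovery behavior but suppresses this warning.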
12081 1726882381.40026: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
12081 1726882381.40030: _low_level_execute_command(): starting
12081 1726882381.40032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882380.2781296-12097-77882765978325/ > /dev/null 2>&1 && sleep 0'
12081 1726882381.40034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12081 1726882381.40036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882381.40038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882381.40040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882381.40041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882381.40044: stderr chunk (state=3): >>>debug2: match not found <<<
12081 1726882381.40046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882381.40048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12081 1726882381.40050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
12081 1726882381.40051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12081 1726882381.40054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12081 1726882381.40055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12081 1726882381.40057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12081 1726882381.40059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
12081 1726882381.40061: stderr chunk (state=3): >>>debug2: match found <<<
12081 1726882381.40068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12081 1726882381.40072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12081 1726882381.40073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12081 1726882381.40075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12081 1726882381.40209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12081 1726882381.42058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12081 1726882381.42141: stderr chunk (state=3): >>><<<
12081 1726882381.42144: stdout chunk (state=3): >>><<<
12081 1726882381.42769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12081 1726882381.42773: handler run complete
12081 1726882381.42775: variable 'ansible_facts' from source: unknown
12081 1726882381.42778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.42780: variable 'ansible_facts' from source: unknown
12081 1726882381.42807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.42937: attempt loop complete, returning result
12081 1726882381.42947: _execute() done
12081 1726882381.42958: dumping result to json
12081 1726882381.42994: done dumping result, returning
12081 1726882381.43006: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-0a3f-ff3c-000000000015]
12081 1726882381.43013: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000015
12081 1726882381.47501: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000015
12081 1726882381.47505: WORKER PROCESS EXITING
ok: [managed_node3]
12081 1726882381.47818: no more pending results, returning what we have
12081 1726882381.47820: results queue empty
12081 1726882381.47821: checking for any_errors_fatal
12081 1726882381.47823: done checking for any_errors_fatal
12081 1726882381.47823: checking for max_fail_percentage
12081 1726882381.47825: done checking for max_fail_percentage
12081 1726882381.47825: checking to see if all hosts have failed and the running result is not ok
12081 1726882381.47826: done checking to see if all hosts have failed
12081 1726882381.47827: getting the remaining hosts for this loop
12081 1726882381.47828: done getting the remaining hosts for this loop
12081 1726882381.47832: getting the next task for host managed_node3
12081 1726882381.47836: done getting next task for host managed_node3
12081 1726882381.47837: ^ task is: TASK: meta (flush_handlers)
12081 1726882381.47861: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882381.47867: getting variables
12081 1726882381.47869: in VariableManager get_vars()
12081 1726882381.47883: Calling all_inventory to load vars for managed_node3
12081 1726882381.47886: Calling groups_inventory to load vars for managed_node3
12081 1726882381.47889: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882381.47896: Calling all_plugins_play to load vars for managed_node3
12081 1726882381.47898: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882381.47901: Calling groups_plugins_play to load vars for managed_node3
12081 1726882381.48064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.48250: done with get_vars()
12081 1726882381.48261: done getting variables
12081 1726882381.48315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
12081 1726882381.48367: in VariableManager get_vars()
12081 1726882381.48376: Calling all_inventory to load vars for managed_node3
12081 1726882381.48378: Calling groups_inventory to load vars for managed_node3
12081 1726882381.48380: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882381.48387: Calling all_plugins_play to load vars for managed_node3
12081 1726882381.48392: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882381.48395: Calling groups_plugins_play to load vars for managed_node3
12081 1726882381.48520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.48716: done with get_vars()
12081 1726882381.48730: done queuing things up, now waiting for results queue to drain
12081 1726882381.48732: results queue empty
12081 1726882381.48733: checking for any_errors_fatal
12081 1726882381.48736: done checking for any_errors_fatal
12081 1726882381.48736: checking for max_fail_percentage
12081 1726882381.48738: done checking for max_fail_percentage
12081 1726882381.48738: checking to see if all hosts have failed and the running result is not ok
12081 1726882381.48739: done checking to see if all hosts have failed
12081 1726882381.48740: getting the remaining hosts for this loop
12081 1726882381.48741: done getting the remaining hosts for this loop
12081 1726882381.48744: getting the next task for host managed_node3
12081 1726882381.48748: done getting next task for host managed_node3
12081 1726882381.48750: ^ task is: TASK: Include the task 'el_repo_setup.yml'
12081 1726882381.48751: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882381.48754: getting variables
12081 1726882381.48755: in VariableManager get_vars()
12081 1726882381.48765: Calling all_inventory to load vars for managed_node3
12081 1726882381.48767: Calling groups_inventory to load vars for managed_node3
12081 1726882381.48769: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882381.48774: Calling all_plugins_play to load vars for managed_node3
12081 1726882381.48776: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882381.48779: Calling groups_plugins_play to load vars for managed_node3
12081 1726882381.48945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.49173: done with get_vars()
12081 1726882381.49188: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11
Friday 20 September 2024 21:33:01 -0400 (0:00:01.283) 0:00:01.295 ******
12081 1726882381.49270: entering _queue_task() for managed_node3/include_tasks
12081 1726882381.49272: Creating lock for include_tasks
12081 1726882381.51156: worker is 1 (out of 1 available)
12081 1726882381.51169: exiting _queue_task() for managed_node3/include_tasks
12081 1726882381.51179: done queuing things up, now waiting for results queue to drain
12081 1726882381.51181: waiting for pending results...
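The include task queued above lives at `tests_bond_options_nm.yml:11`; a hypothetical reconstruction of what such a task looks like (the actual test file may differ):

```yaml
# Hypothetical sketch of the including task at tests_bond_options_nm.yml:11;
# include_tasks is dynamic, which is why the log later shows
# "we have included files to process" and "extending task lists" at runtime.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```

Because `include_tasks` resolves at runtime rather than parse time, the included blocks are filtered on tags and appended to each host's task list only after the include task itself executes, as the subsequent log records show.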
12081 1726882381.51794: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml'
12081 1726882381.51880: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000006
12081 1726882381.51891: variable 'ansible_search_path' from source: unknown
12081 1726882381.51931: calling self._execute()
12081 1726882381.52006: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882381.52018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882381.52035: variable 'omit' from source: magic vars
12081 1726882381.52131: _execute() done
12081 1726882381.52143: dumping result to json
12081 1726882381.52150: done dumping result, returning
12081 1726882381.52161: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000006]
12081 1726882381.52173: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000006
12081 1726882381.52315: no more pending results, returning what we have
12081 1726882381.52320: in VariableManager get_vars()
12081 1726882381.52354: Calling all_inventory to load vars for managed_node3
12081 1726882381.52357: Calling groups_inventory to load vars for managed_node3
12081 1726882381.52361: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882381.52377: Calling all_plugins_play to load vars for managed_node3
12081 1726882381.52381: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882381.52384: Calling groups_plugins_play to load vars for managed_node3
12081 1726882381.52611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882381.52803: done with get_vars()
12081 1726882381.52811: variable 'ansible_search_path' from source: unknown
12081 1726882381.52829: we have included files to process
12081 1726882381.52830: generating all_blocks data
12081 1726882381.52832: done generating all_blocks data
12081 1726882381.52833: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
12081 1726882381.52834: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
12081 1726882381.52836: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
12081 1726882381.53369: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000006
12081 1726882381.53372: WORKER PROCESS EXITING
12081 1726882381.53815: in VariableManager get_vars()
12081 1726882381.53832: done with get_vars()
12081 1726882381.53843: done processing included file
12081 1726882381.53845: iterating over new_blocks loaded from include file
12081 1726882381.53847: in VariableManager get_vars()
12081 1726882381.53855: done with get_vars()
12081 1726882381.53857: filtering new block on tags
12081 1726882381.53877: done filtering new block on tags
12081 1726882381.53880: in VariableManager get_vars()
12081 1726882381.53891: done with get_vars()
12081 1726882381.53892: filtering new block on tags
12081 1726882381.53907: done filtering new block on tags
12081 1726882381.53909: in VariableManager get_vars()
12081 1726882381.53919: done with get_vars()
12081 1726882381.53921: filtering new block on tags
12081 1726882381.53933: done filtering new block on tags
12081 1726882381.53935: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3
12081 1726882381.53941: extending task lists for all hosts with included blocks
12081 1726882381.53994: done extending task lists
12081 1726882381.53996: done processing included files
12081 1726882381.53996: results queue empty
12081 1726882381.53997: checking for any_errors_fatal
12081 1726882381.53999: done checking for any_errors_fatal
12081 1726882381.53999: checking for max_fail_percentage 12081 1726882381.54000: done checking for max_fail_percentage 12081 1726882381.54001: checking to see if all hosts have failed and the running result is not ok 12081 1726882381.54002: done checking to see if all hosts have failed 12081 1726882381.54003: getting the remaining hosts for this loop 12081 1726882381.54004: done getting the remaining hosts for this loop 12081 1726882381.54006: getting the next task for host managed_node3 12081 1726882381.54010: done getting next task for host managed_node3 12081 1726882381.54012: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12081 1726882381.54014: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882381.54016: getting variables 12081 1726882381.54017: in VariableManager get_vars() 12081 1726882381.54025: Calling all_inventory to load vars for managed_node3 12081 1726882381.54027: Calling groups_inventory to load vars for managed_node3 12081 1726882381.54029: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882381.54034: Calling all_plugins_play to load vars for managed_node3 12081 1726882381.54036: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882381.54039: Calling groups_plugins_play to load vars for managed_node3 12081 1726882381.54205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882381.54390: done with get_vars() 12081 1726882381.54399: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:00.051) 0:00:01.347 ****** 12081 1726882381.54471: entering _queue_task() for managed_node3/setup 12081 1726882381.54728: worker is 1 (out of 1 available) 12081 1726882381.54743: exiting _queue_task() for managed_node3/setup 12081 1726882381.54753: done queuing things up, now waiting for results queue to drain 12081 1726882381.54755: waiting for pending results... 
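(Aside: the task queued above is the `setup` action guarded by the conditional evaluated later in the trace. A sketch of what such a "minimal facts" task in `el_repo_setup.yml` typically looks like is below; the `gather_subset` value is an assumption, only the task name and the `when:` expression are taken from this log.)

```yaml
# Illustrative only - reconstructed from the log, not copied from the file.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min   # assumed; the log does not show the module args
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts)
        == network_test_required_facts
```

The conditional skips fact gathering when every fact named in `network_test_required_facts` is already present in `ansible_facts`, which is why the log shows it evaluating to True on this first pass (no facts cached yet).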
12081 1726882381.54985: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 12081 1726882381.55080: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000026 12081 1726882381.55101: variable 'ansible_search_path' from source: unknown 12081 1726882381.55108: variable 'ansible_search_path' from source: unknown 12081 1726882381.55148: calling self._execute() 12081 1726882381.55228: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882381.55239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882381.55252: variable 'omit' from source: magic vars 12081 1726882381.55787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882381.59275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882381.59367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882381.59416: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882381.59457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882381.59498: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882381.59583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882381.59623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882381.59655: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882381.59712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882381.59732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882381.59929: variable 'ansible_facts' from source: unknown 12081 1726882381.60011: variable 'network_test_required_facts' from source: task vars 12081 1726882381.60058: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12081 1726882381.60073: variable 'omit' from source: magic vars 12081 1726882381.60120: variable 'omit' from source: magic vars 12081 1726882381.60161: variable 'omit' from source: magic vars 12081 1726882381.60194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882381.60225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882381.60250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882381.60274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882381.60287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882381.60318: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882381.60329: variable 'ansible_host' from source: host vars for 
'managed_node3' 12081 1726882381.60338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882381.60439: Set connection var ansible_pipelining to False 12081 1726882381.60448: Set connection var ansible_shell_type to sh 12081 1726882381.60466: Set connection var ansible_shell_executable to /bin/sh 12081 1726882381.60475: Set connection var ansible_connection to ssh 12081 1726882381.60486: Set connection var ansible_timeout to 10 12081 1726882381.60495: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882381.60522: variable 'ansible_shell_executable' from source: unknown 12081 1726882381.60529: variable 'ansible_connection' from source: unknown 12081 1726882381.60535: variable 'ansible_module_compression' from source: unknown 12081 1726882381.60540: variable 'ansible_shell_type' from source: unknown 12081 1726882381.60545: variable 'ansible_shell_executable' from source: unknown 12081 1726882381.60556: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882381.60564: variable 'ansible_pipelining' from source: unknown 12081 1726882381.60575: variable 'ansible_timeout' from source: unknown 12081 1726882381.60583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882381.60730: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882381.60745: variable 'omit' from source: magic vars 12081 1726882381.60754: starting attempt loop 12081 1726882381.60760: running the handler 12081 1726882381.60782: _low_level_execute_command(): starting 12081 1726882381.60798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882381.62541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
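(Aside: the "Set connection var" lines above show the per-host connection settings resolved for managed_node3. In this run most of them came from built-in defaults rather than the inventory, so the snippet below is purely an illustration of where such values *could* be pinned per host; the host name is the one from this log, everything else is hypothetical.)

```yaml
# Hypothetical inventory fragment - not the actual /tmp/network-91m/inventory.yml.
all:
  hosts:
    managed_node3:
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_shell_executable: /bin/sh
```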
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.62545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.62569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.62573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.62590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882381.62592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.62636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882381.63288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882381.63291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882381.63412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882381.65117: stdout chunk (state=3): >>>/root <<< 12081 1726882381.65200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882381.65298: stderr chunk (state=3): >>><<< 12081 1726882381.65302: stdout chunk (state=3): >>><<< 12081 1726882381.65423: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882381.65435: _low_level_execute_command(): starting 12081 1726882381.65438: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941 `" && echo ansible-tmp-1726882381.6532228-12150-178478077324941="` echo /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941 `" ) && sleep 0' 12081 1726882381.67235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.67240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.67270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
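(Aside: the `umask 77 && mkdir -p ... && mkdir ...` command above is how Ansible creates its per-task remote temp directory with owner-only permissions. A minimal re-creation of that pattern, with illustrative paths in place of the real `ansible-tmp-*` names:)

```shell
# Sketch of the temp-dir pattern from the command above; paths are
# placeholders, not the real remote directories.
tmpbase="${TMPDIR:-/tmp}/ansible-demo"
# umask 77 strips group/other bits, so the new directory comes out 0700:
( umask 77 && mkdir -p "$tmpbase" && mkdir "$tmpbase/task-$$" )
stat -c '%a' "$tmpbase/task-$$"   # typically prints 700 on Linux
```

The subshell keeps the `umask` change local, and the `&& echo name=path` tail in the real command is how the controller learns the directory that was actually created.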
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.67276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.67279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.67282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.67991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882381.68698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882381.68702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882381.68835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882381.70729: stdout chunk (state=3): >>>ansible-tmp-1726882381.6532228-12150-178478077324941=/root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941 <<< 12081 1726882381.70910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882381.70914: stdout chunk (state=3): >>><<< 12081 1726882381.70922: stderr chunk (state=3): >>><<< 12081 1726882381.70941: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882381.6532228-12150-178478077324941=/root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882381.70994: variable 'ansible_module_compression' from source: unknown 12081 1726882381.71050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12081 1726882381.71114: variable 'ansible_facts' from source: unknown 12081 1726882381.71272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/AnsiballZ_setup.py 12081 1726882381.71423: Sending initial data 12081 1726882381.71428: Sent initial data (154 bytes) 12081 1726882381.72384: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882381.72388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.72394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.72410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.72447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 
1726882381.72466: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882381.72481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.72500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882381.72514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882381.72526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882381.72540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.72556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.72576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.72588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882381.72599: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882381.72613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.72699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882381.72716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882381.72731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882381.72873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882381.74630: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882381.74731: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882381.74833: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpry6f0vd6 /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/AnsiballZ_setup.py <<< 12081 1726882381.74931: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882381.77957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882381.78174: stderr chunk (state=3): >>><<< 12081 1726882381.78178: stdout chunk (state=3): >>><<< 12081 1726882381.78181: done transferring module to remote 12081 1726882381.78183: _low_level_execute_command(): starting 12081 1726882381.78186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/ /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/AnsiballZ_setup.py && sleep 0' 12081 1726882381.79061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882381.79082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.79102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.79122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.79188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882381.79219: stderr chunk (state=3): >>>debug2: match not found <<< 
12081 1726882381.79235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.79258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882381.79278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882381.79296: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882381.79308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.79325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.79344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.79371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882381.79386: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882381.79406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.79494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882381.79522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882381.79537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882381.80077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882381.81444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882381.81514: stderr chunk (state=3): >>><<< 12081 1726882381.81517: stdout chunk (state=3): >>><<< 12081 1726882381.81612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882381.81616: _low_level_execute_command(): starting 12081 1726882381.81618: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/AnsiballZ_setup.py && sleep 0' 12081 1726882381.82508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882381.82521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.82540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.82560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.82602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882381.82616: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882381.82628: stderr chunk (state=3): >>>debug1: Reading configuration data 
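(Aside: the module is executed with `PYTHONVERBOSE=1`, which makes the interpreter trace every import to stderr; that is where all the `import ...` and `# code object from ...` stdout chunks that follow come from. A minimal reproduction:)

```shell
# PYTHONVERBOSE=1 is equivalent to running python with -v: the interpreter
# reports each module as it is imported, even for a no-op program.
PYTHONVERBOSE=1 python3 -c 'pass' 2>&1 | head -n 3
```

Each traced line also notes where the module came from (frozen, builtin, or a `.pyc` under `__pycache__`), which is why the chunks below pair every `import` with a source path.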
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.82644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882381.82654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882381.82667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882381.82678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882381.82690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882381.82703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882381.82716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882381.82727: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882381.82739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882381.82815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882381.82839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882381.82854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882381.82990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882381.84969: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12081 1726882381.84973: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12081 1726882381.85023: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12081 1726882381.85082: stdout chunk (state=3): >>>import 'posix' # <<< 12081 1726882381.85096: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12081 1726882381.85140: stdout chunk 
(state=3): >>>import 'time' # <<< 12081 1726882381.85143: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12081 1726882381.85223: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.85233: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 12081 1726882381.85245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 12081 1726882381.85274: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3dc0> <<< 12081 1726882381.85311: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12081 1726882381.85341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3b20> <<< 12081 1726882381.85359: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12081 1726882381.85378: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3ac0> <<< 12081 1726882381.85400: stdout chunk (state=3): >>>import '_signal' # <<< 12081 1726882381.85425: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12081 1726882381.85428: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398490> <<< 12081 1726882381.85450: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12081 1726882381.85486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882381.85499: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398940> <<< 12081 1726882381.85518: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398670> <<< 12081 1726882381.85570: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12081 1726882381.85574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12081 1726882381.85593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12081 1726882381.85609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12081 1726882381.85628: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12081 1726882381.85652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12081 1726882381.85673: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f190> <<< 
12081 1726882381.85692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 12081 1726882381.85706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12081 1726882381.85785: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f220> <<< 12081 1726882381.85810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12081 1726882381.85850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c372850> <<< 12081 1726882381.85853: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f940> <<< 12081 1726882381.85912: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3b0880> <<< 12081 1726882381.85917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c348d90> <<< 12081 1726882381.85979: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 12081 1726882381.85982: stdout chunk (state=3): >>>import '_locale' # import 
'_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c372d90> <<< 12081 1726882381.86032: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398970> <<< 12081 1726882381.86070: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12081 1726882381.86404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12081 1726882381.86442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 12081 1726882381.86445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12081 1726882381.86479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12081 1726882381.86482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12081 1726882381.86512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12081 1726882381.86516: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2eeeb0> <<< 12081 1726882381.86568: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f1f40> <<< 12081 1726882381.86594: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12081 
1726882381.86606: stdout chunk (state=3): >>>import '_sre' # <<< 12081 1726882381.86635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12081 1726882381.86664: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12081 1726882381.86706: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2e7610> <<< 12081 1726882381.86710: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2ed640> <<< 12081 1726882381.86726: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2ee370> <<< 12081 1726882381.86742: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12081 1726882381.86809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 12081 1726882381.86827: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 12081 1726882381.86866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.86886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12081 1726882381.86924: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' 
# extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bf90df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf908e0> <<< 12081 1726882381.86944: stdout chunk (state=3): >>>import 'itertools' # <<< 12081 1726882381.86970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90ee0> <<< 12081 1726882381.86983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 12081 1726882381.87001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12081 1726882381.87024: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90fa0> <<< 12081 1726882381.87065: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 12081 1726882381.87079: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90eb0> <<< 12081 1726882381.87082: stdout chunk (state=3): >>>import '_collections' # <<< 12081 1726882381.87122: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c9d60> <<< 12081 1726882381.87133: stdout chunk (state=3): >>>import '_functools' # <<< 12081 1726882381.87154: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c2640> <<< 12081 
1726882381.87214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2d56a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f5df0> <<< 12081 1726882381.87239: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 12081 1726882381.87273: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bfa3ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c9280> <<< 12081 1726882381.87318: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882381.87335: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1c2d52b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2fb9a0> <<< 12081 1726882381.87353: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 12081 1726882381.87376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 12081 1726882381.87393: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.87422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 12081 1726882381.87435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3dc0> <<< 12081 1726882381.87463: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 12081 1726882381.87491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 12081 1726882381.87517: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 12081 1726882381.87527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882381.87546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12081 1726882381.87597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 12081 1726882381.87627: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf763a0> <<< 12081 1726882381.87650: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 12081 1726882381.87662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12081 1726882381.87689: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf76490> <<< 12081 1726882381.87814: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfaafd0> <<< 12081 1726882381.87855: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5a60> <<< 12081 1726882381.87888: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5580> <<< 12081 1726882381.87903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12081 1726882381.87944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 12081 1726882381.87977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 12081 1726882381.87988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bec41f0> <<< 12081 1726882381.88013: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf61b80> <<< 12081 1726882381.88070: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f5fd0> <<< 12081 1726882381.88092: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 12081 1726882381.88135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 12081 1726882381.88153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bed6b20> <<< 12081 1726882381.88166: stdout chunk (state=3): >>>import 'errno' # <<< 12081 1726882381.88195: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bed6e50> <<< 12081 1726882381.88214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 12081 1726882381.88246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 12081 1726882381.88260: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from 
'/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee8760> <<< 12081 1726882381.88282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 12081 1726882381.88307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 12081 1726882381.88341: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee8ca0> <<< 12081 1726882381.88384: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so'<<< 12081 1726882381.88402: stdout chunk (state=3): >>> # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be803d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bed6f40> <<< 12081 1726882381.88419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 12081 1726882381.88474: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be912b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee85e0> <<< 12081 1726882381.88486: stdout chunk (state=3): >>>import 'pwd' # <<< 12081 1726882381.88509: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be91370> <<< 12081 1726882381.88540: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3a00> <<< 12081 1726882381.88575: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 12081 1726882381.88586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 12081 1726882381.88609: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 12081 1726882381.88619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 12081 1726882381.88647: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac6d0> <<< 12081 1726882381.88676: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 12081 1726882381.88702: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac9a0> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beac790> <<< 12081 1726882381.88730: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac880> <<< 12081 1726882381.88756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12081 1726882381.88956: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beaccd0> <<< 12081 1726882381.88993: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beb9220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beac910> <<< 12081 1726882381.89018: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bea0a60> <<< 12081 1726882381.89036: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa35e0> <<< 12081 1726882381.89054: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 12081 
1726882381.89124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 12081 1726882381.89151: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beacac0> <<< 12081 1726882381.89302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12081 1726882381.89313: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6f1bdd46a0> <<< 12081 1726882381.89581: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip' # zipimport: zlib available <<< 12081 1726882381.89677: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.89713: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/__init__.py <<< 12081 1726882381.89739: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 12081 1726882381.89751: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.90957: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.91888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec7f0> <<< 12081 1726882381.91922: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12081 1726882381.91950: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12081 1726882381.91972: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ec160> <<< 12081 1726882381.91995: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec280> <<< 12081 1726882381.92025: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ecf40> <<< 12081 1726882381.92051: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12081 1726882381.92094: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ecd60> import 'atexit' # <<< 12081 1726882381.92125: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ecfa0> <<< 12081 
1726882381.92144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12081 1726882381.92166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12081 1726882381.92205: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec100> <<< 12081 1726882381.92223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12081 1726882381.92241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12081 1726882381.92258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12081 1726882381.92278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12081 1726882381.92307: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 12081 1726882381.92317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12081 1726882381.92388: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7c3100> <<< 12081 1726882381.92426: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b6c8100> <<< 12081 1726882381.92451: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b6c82e0> <<< 12081 1726882381.92476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12081 1726882381.92519: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b6c8c70> <<< 12081 1726882381.92534: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d2dc0> <<< 12081 1726882381.92698: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d23a0> <<< 12081 1726882381.92723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12081 1726882381.92751: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d2fa0> <<< 12081 1726882381.92765: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12081 1726882381.92794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12081 1726882381.92824: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12081 1726882381.92859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd48c70> <<< 12081 1726882381.92933: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ced00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ce3d0> <<< 12081 1726882381.92945: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7a1b50> <<< 12081 1726882381.92985: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ce4f0> <<< 12081 1726882381.92997: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ce520> <<< 12081 1726882381.93020: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 12081 1726882381.93037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12081 1726882381.93049: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches 
/usr/lib64/python3.9/datetime.py <<< 12081 1726882381.93084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12081 1726882381.93150: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b724310> <<< 12081 1726882381.93175: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5a220> <<< 12081 1726882381.93192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12081 1726882381.93236: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882381.93253: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b730880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5a3a0> <<< 12081 1726882381.93266: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 12081 1726882381.93305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.93327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 
12081 1726882381.93384: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5aca0> <<< 12081 1726882381.93520: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b730820> <<< 12081 1726882381.93607: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7cdaf0> <<< 12081 1726882381.93639: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bd5a940> <<< 12081 1726882381.93681: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bd5a5b0> <<< 12081 1726882381.93701: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd528e0> <<< 12081 1726882381.93720: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' 
<<< 12081 1726882381.93744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12081 1726882381.93787: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b726970> <<< 12081 1726882381.93977: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b743d60> <<< 12081 1726882381.94003: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b72f5e0> <<< 12081 1726882381.94027: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b726f10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b72f9d0> # zipimport: zlib available <<< 12081 1726882381.94054: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 12081 1726882381.94075: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12081 1726882381.94133: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.94208: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.94232: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 12081 1726882381.94254: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 12081 1726882381.94269: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.94360: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.94453: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.94900: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.95351: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 12081 1726882381.95358: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 12081 1726882381.95372: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 12081 1726882381.95383: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 12081 1726882381.95392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.95446: stdout chunk (state=3): >>># extension module '_ctypes' loaded 
from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882381.95452: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7427f0> <<< 12081 1726882381.95523: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 12081 1726882381.95526: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b77d880> <<< 12081 1726882381.95531: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2bc9a0> <<< 12081 1726882381.95607: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 12081 1726882381.95619: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.95623: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.95626: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 12081 1726882381.95628: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.95749: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.95875: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12081 1726882381.95897: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6f1b7a9730> <<< 12081 1726882381.95902: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96297: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96659: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96712: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96783: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 12081 1726882381.96786: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96814: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96859: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available<<< 12081 1726882381.96862: stdout chunk (state=3): >>> <<< 12081 1726882381.96909: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.96990: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 12081 1726882381.97017: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 12081 1726882381.97022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97045: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97094: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 12081 
1726882381.97097: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97278: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 12081 1726882381.97497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 12081 1726882381.97572: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ef3a0> # zipimport: zlib available <<< 12081 1726882381.97630: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97714: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 12081 1726882381.97724: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12081 1726882381.97736: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97772: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97813: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 12081 1726882381.97821: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97839: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882381.97881: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.97976: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98034: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 12081 1726882381.98053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.98125: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b761610> <<< 12081 1726882381.98207: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b14db50> <<< 12081 1726882381.98243: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 12081 1726882381.98256: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98301: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98359: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98379: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98418: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches 
/usr/lib/python3.9/site-packages/distro.py <<< 12081 1726882381.98443: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 12081 1726882381.98447: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12081 1726882381.98488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 12081 1726882381.98499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 12081 1726882381.98521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12081 1726882381.98596: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7746a0> <<< 12081 1726882381.98638: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7c0e50> <<< 12081 1726882381.98697: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ef850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 12081 1726882381.98725: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98748: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 12081 1726882381.98830: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 12081 1726882381.98849: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 12081 1726882381.98868: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98917: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98976: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.98987: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99009: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99038: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99077: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99106: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99142: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 12081 1726882381.99154: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99210: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99285: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99297: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99326: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 12081 1726882381.99476: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99610: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882381.99642: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882381.99684: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882381.99772: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 12081 1726882381.99811: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ba6d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 12081 1726882381.99823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 12081 1726882381.99846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 12081 1726882381.99882: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 12081 1726882381.99898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b29ea30> <<< 12081 1726882381.99925: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b29e9a0> <<< 12081 1726882382.00002: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ce040> <<< 12081 1726882382.00025: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ba520> <<< 12081 1726882382.00038: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037be0> <<< 12081 1726882382.00074: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 12081 1726882382.00100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 12081 1726882382.00125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 12081 1726882382.00144: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7cfd00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b27ee80> <<< 12081 1726882382.00173: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/multiprocessing/util.py <<< 12081 1726882382.00190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 12081 1726882382.00215: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7cf0d0> <<< 12081 1726882382.00233: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 12081 1726882382.00244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 12081 1726882382.00274: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b0a0fd0> <<< 12081 1726882382.00314: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2cce50> <<< 12081 1726882382.00350: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 12081 1726882382.00375: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 12081 1726882382.00385: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00438: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00487: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 12081 1726882382.00501: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00529: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00587: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 12081 1726882382.00607: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 12081 1726882382.00618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00634: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.00672: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 12081 1726882382.00714: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01224: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # 
loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 12081 1726882382.01440: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01800: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 12081 1726882382.01843: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01893: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01916: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01962: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 12081 1726882382.01966: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.01992: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02017: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib 
available <<< 12081 1726882382.02073: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02131: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 12081 1726882382.02136: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02149: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02180: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 12081 1726882382.02225: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02237: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 12081 1726882382.02318: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 12081 1726882382.02399: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1afb6e50> <<< 12081 1726882382.02423: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 12081 1726882382.02445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 12081 1726882382.02609: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1afb69d0> import 
ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 12081 1726882382.02612: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02668: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02725: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 12081 1726882382.02802: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02895: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 12081 1726882382.02898: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.02941: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03012: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 12081 1726882382.03043: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03117: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 12081 1726882382.03120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 12081 1726882382.03259: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.03263: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1afae790> <<< 12081 1726882382.03490: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2bf7f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 12081 1726882382.03539: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03596: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 12081 1726882382.03600: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03672: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03734: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03826: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.03969: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 12081 1726882382.03972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04003: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04040: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 12081 1726882382.04043: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12081 1726882382.04078: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 12081 1726882382.04178: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.04201: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1af74310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af74340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 12081 1726882382.04217: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04248: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04297: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 12081 1726882382.04300: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04429: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04557: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 12081 1726882382.04636: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04719: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04751: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04799: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 12081 1726882382.04806: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04891: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.04894: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.05009: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.05135: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 12081 1726882382.06000: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12081 1726882382.06231: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 12081 1726882382.06244: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06325: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06415: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 12081 1726882382.06499: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06583: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 12081 1726882382.06594: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06717: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06854: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 12081 1726882382.06881: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 12081 1726882382.06911: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.06965: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 12081 1726882382.06969: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07043: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07123: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07294: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07469: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 12081 1726882382.07477: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07497: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07550: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 12081 1726882382.07553: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07578: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07592: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 12081 1726882382.07650: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07725: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 12081 
1726882382.07728: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07755: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07773: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 12081 1726882382.07819: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07876: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 12081 1726882382.07923: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.07980: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 12081 1726882382.08192: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08411: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 12081 1726882382.08459: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08515: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 12081 1726882382.08572: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08586: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 12081 1726882382.08618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08647: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 12081 1726882382.08680: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08719: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 12081 1726882382.08722: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08784: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08865: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 12081 1726882382.08897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 12081 1726882382.08900: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08927: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.08982: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 12081 1726882382.09005: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09020: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09049: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09102: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09150: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09223: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 12081 1726882382.09239: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09275: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09329: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 12081 1726882382.09333: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09486: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09647: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 12081 1726882382.09690: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09743: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 12081 1726882382.09746: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09780: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09827: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 12081 1726882382.09830: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09895: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.09978: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 12081 1726882382.09981: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.10043: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.10125: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 12081 1726882382.10194: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.10976: stdout chunk (state=3): >>>import 'gc' # <<< 12081 1726882382.11301: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 12081 1726882382.11326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 12081 1726882382.11581: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1af97c40> <<< 12081 1726882382.11585: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af267f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af26940> <<< 12081 1726882382.13671: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 
0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "02", "epoch": "1726882382", "epoch_int": "1726882382", "date": "2024-09-20", "time": "21:33:02", "iso8601_micro": "2024-09-21T01:33:02.111515Z", "iso8601": "2024-09-21T01:33:02Z", "iso8601_basic": "20240920T213302111515", "iso8601_basic_short": "20240920T213302", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 
10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12081 1726882382.14233: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 12081 1726882382.14310: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] 
removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # 
cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12081 1726882382.14602: stdout 
chunk (state=3): >>># destroy _sitebuiltins <<< 12081 1726882382.14613: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12081 1726882382.14678: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 12081 1726882382.14712: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 12081 1726882382.14715: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12081 1726882382.14752: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 12081 1726882382.14812: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 12081 1726882382.14877: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 <<< 12081 1726882382.14919: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12081 1726882382.14982: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl <<< 12081 1726882382.15041: stdout chunk 
(state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select <<< 12081 1726882382.15094: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 12081 1726882382.15136: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] 
wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 12081 1726882382.15204: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 12081 1726882382.15208: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12081 1726882382.15279: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 12081 1726882382.15417: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 12081 1726882382.15451: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat <<< 12081 1726882382.15506: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # 
destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 12081 1726882382.15509: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 12081 1726882382.15539: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 12081 1726882382.15906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882382.15909: stdout chunk (state=3): >>><<< 12081 1726882382.15911: stderr chunk (state=3): >>><<< 12081 1726882382.16053: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' 
import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c372850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c34f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c3b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c348d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c398970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2eeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2e7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2ee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bf90df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf908e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90ee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90fa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf90eb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c9d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c2640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2d56a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f5df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bfa3ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2c9280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1c2d52b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2fb9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf763a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf76490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfaafd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bec41f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bf61b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa5ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1c2f5fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bed6b20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bed6e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee8760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee8ca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be803d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bed6f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be912b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bee85e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1be91370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa3a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac6d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac9a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beac790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beac880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beaccd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1beb9220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beac910> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bea0a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bfa35e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1beacac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6f1bdd46a0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec7f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ec160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ecf40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ecd60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ecfa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ec100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7c3100> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b6c8100> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b6c82e0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b6c8c70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d2dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d23a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7d2fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd48c70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ced00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ce3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7a1b50> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7ce4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ce520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b724310> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5a220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b730880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5a3a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd5aca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b730820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7cdaf0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bd5a940> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1bd5a5b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1bd528e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b726970> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b743d60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b72f5e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b726f10> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b72f9d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7427f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b77d880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2bc9a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7a9730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ef3a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b761610> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b14db50> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7746a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7c0e50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7ef850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ba6d0> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b29ea30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b29e9a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ce040> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2ba520> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037be0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b7cfd00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b27ee80> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b7cf0d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1b0a0fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2cce50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b037e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1afb6e50> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1afb69d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1afae790> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1b2bf7f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1af74310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af74340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_uklgmyig/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6f1af97c40> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af267f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6f1af26940> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "02", "epoch": "1726882382", 
"epoch_int": "1726882382", "date": "2024-09-20", "time": "21:33:02", "iso8601_micro": "2024-09-21T01:33:02.111515Z", "iso8601": "2024-09-21T01:33:02Z", "iso8601_basic": "20240920T213302111515", "iso8601_basic_short": "20240920T213302", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias 
--read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy 
sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # 
cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing 
ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy 
ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy 
_compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl 
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy 
ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # 
cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 12081 1726882382.17285: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882382.17289: _low_level_execute_command(): starting 12081 1726882382.17292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882381.6532228-12150-178478077324941/ > /dev/null 2>&1 && sleep 0' 12081 1726882382.17804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882382.17823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.17838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.17861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.17908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 
12081 1726882382.17925: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882382.17941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.17962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882382.17978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882382.17990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882382.18002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.18015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.18036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.18052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.18070: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882382.18084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.18166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.18186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.18219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.18406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.20232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.20236: stdout chunk (state=3): >>><<< 12081 1726882382.20240: stderr chunk (state=3): >>><<< 12081 1726882382.20258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882382.20266: handler run complete 12081 1726882382.20313: variable 'ansible_facts' from source: unknown 12081 1726882382.20387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.20501: variable 'ansible_facts' from source: unknown 12081 1726882382.20546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.20599: attempt loop complete, returning result 12081 1726882382.20603: _execute() done 12081 1726882382.20605: dumping result to json 12081 1726882382.20616: done dumping result, returning 12081 1726882382.20624: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-0a3f-ff3c-000000000026] 12081 1726882382.20631: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000026 12081 1726882382.20791: 
done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000026 12081 1726882382.20793: WORKER PROCESS EXITING ok: [managed_node3] 12081 1726882382.20901: no more pending results, returning what we have 12081 1726882382.20905: results queue empty 12081 1726882382.20906: checking for any_errors_fatal 12081 1726882382.20908: done checking for any_errors_fatal 12081 1726882382.20909: checking for max_fail_percentage 12081 1726882382.20911: done checking for max_fail_percentage 12081 1726882382.20912: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.20913: done checking to see if all hosts have failed 12081 1726882382.20913: getting the remaining hosts for this loop 12081 1726882382.20915: done getting the remaining hosts for this loop 12081 1726882382.20920: getting the next task for host managed_node3 12081 1726882382.20930: done getting next task for host managed_node3 12081 1726882382.20932: ^ task is: TASK: Check if system is ostree 12081 1726882382.20935: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.20939: getting variables 12081 1726882382.20940: in VariableManager get_vars() 12081 1726882382.20975: Calling all_inventory to load vars for managed_node3 12081 1726882382.20978: Calling groups_inventory to load vars for managed_node3 12081 1726882382.20981: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.20992: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.20994: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.20997: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.21192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.21403: done with get_vars() 12081 1726882382.21416: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:33:02 -0400 (0:00:00.676) 0:00:02.023 ****** 12081 1726882382.22101: entering _queue_task() for managed_node3/stat 12081 1726882382.22401: worker is 1 (out of 1 available) 12081 1726882382.22413: exiting _queue_task() for managed_node3/stat 12081 1726882382.22427: done queuing things up, now waiting for results queue to drain 12081 1726882382.22428: waiting for pending results... 
12081 1726882382.23487: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 12081 1726882382.23582: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000028 12081 1726882382.23586: variable 'ansible_search_path' from source: unknown 12081 1726882382.23590: variable 'ansible_search_path' from source: unknown 12081 1726882382.23624: calling self._execute() 12081 1726882382.23697: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.23701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.23710: variable 'omit' from source: magic vars 12081 1726882382.24290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882382.24573: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882382.24653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882382.24693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882382.24735: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882382.24835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882382.24873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882382.24906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882382.24942: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12081 1726882382.25079: Evaluated conditional (not __network_is_ostree is defined): True
12081 1726882382.25092: variable 'omit' from source: magic vars
12081 1726882382.25134: variable 'omit' from source: magic vars
12081 1726882382.25184: variable 'omit' from source: magic vars
12081 1726882382.25219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12081 1726882382.25251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12081 1726882382.25280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12081 1726882382.25305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882382.25319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882382.25353: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12081 1726882382.25364: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882382.25377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882382.25489: Set connection var ansible_pipelining to False
12081 1726882382.25497: Set connection var ansible_shell_type to sh
12081 1726882382.25509: Set connection var ansible_shell_executable to /bin/sh
12081 1726882382.25520: Set connection var ansible_connection to ssh
12081 1726882382.25530: Set connection var ansible_timeout to 10
12081 1726882382.25539: Set connection var ansible_module_compression to ZIP_DEFLATED
12081 1726882382.25572: variable 'ansible_shell_executable' from source: unknown
12081 1726882382.25580: variable 'ansible_connection' from
source: unknown 12081 1726882382.25592: variable 'ansible_module_compression' from source: unknown 12081 1726882382.25598: variable 'ansible_shell_type' from source: unknown 12081 1726882382.25604: variable 'ansible_shell_executable' from source: unknown 12081 1726882382.25610: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.25617: variable 'ansible_pipelining' from source: unknown 12081 1726882382.25626: variable 'ansible_timeout' from source: unknown 12081 1726882382.25633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.25785: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882382.25801: variable 'omit' from source: magic vars 12081 1726882382.25813: starting attempt loop 12081 1726882382.25819: running the handler 12081 1726882382.25835: _low_level_execute_command(): starting 12081 1726882382.25852: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882382.26814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882382.26840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.26865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.26884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.26932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.26951: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882382.26974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882382.26993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882382.27004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882382.27014: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882382.27029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.27054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.27081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.27100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.27114: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882382.27129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.27222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.27247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.27282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.27426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.29027: stdout chunk (state=3): >>>/root <<< 12081 1726882382.29231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.29234: stdout chunk (state=3): >>><<< 12081 1726882382.29239: stderr chunk (state=3): >>><<< 12081 1726882382.29368: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882382.29379: _low_level_execute_command(): starting 12081 1726882382.29382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415 `" && echo ansible-tmp-1726882382.2926514-12190-157183584159415="` echo /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415 `" ) && sleep 0' 12081 1726882382.29974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882382.29987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.30000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.30024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.30069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.30080: stderr chunk (state=3): >>>debug2: match not found <<< 12081 
1726882382.30092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.30107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882382.30125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882382.30134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882382.30144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.30158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.30175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.30185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.30194: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882382.30204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.30285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.30304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.30316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.30461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.32324: stdout chunk (state=3): >>>ansible-tmp-1726882382.2926514-12190-157183584159415=/root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415 <<< 12081 1726882382.32485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.32533: stderr chunk (state=3): >>><<< 12081 1726882382.32537: stdout chunk (state=3): >>><<< 12081 1726882382.32774: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882382.2926514-12190-157183584159415=/root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882382.32778: variable 'ansible_module_compression' from source: unknown 12081 1726882382.32781: ANSIBALLZ: Using lock for stat 12081 1726882382.32783: ANSIBALLZ: Acquiring lock 12081 1726882382.32785: ANSIBALLZ: Lock acquired: 139893497835696 12081 1726882382.32787: ANSIBALLZ: Creating module 12081 1726882382.44994: ANSIBALLZ: Writing module into payload 12081 1726882382.45111: ANSIBALLZ: Writing module 12081 1726882382.45136: ANSIBALLZ: Renaming module 12081 1726882382.45147: ANSIBALLZ: Done creating module 12081 1726882382.45169: variable 'ansible_facts' from source: unknown 12081 1726882382.45238: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/AnsiballZ_stat.py 12081 1726882382.45423: Sending initial data 12081 1726882382.45432: Sent initial data (153 bytes) 12081 1726882382.46243: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882382.46260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.46283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.46303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.46346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.46361: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882382.46378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.46396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882382.46407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882382.46418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882382.46430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.46443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.46458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.46473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.46484: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882382.46497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.46572: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.46594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.46608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.46766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.48580: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882382.48682: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882382.48801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpdsk23xym /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/AnsiballZ_stat.py <<< 12081 1726882382.48889: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882382.50325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.50589: stderr chunk (state=3): >>><<< 12081 1726882382.50592: stdout chunk (state=3): >>><<< 12081 1726882382.50595: done transferring module to remote 12081 1726882382.50597: _low_level_execute_command(): starting 12081 1726882382.50599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/ /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/AnsiballZ_stat.py && sleep 0' 12081 1726882382.51624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.51628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.51662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882382.51667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.51671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.51729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.52589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.52596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.52704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.54474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.54554: stderr chunk (state=3): >>><<< 12081 1726882382.54558: stdout chunk (state=3): >>><<< 12081 1726882382.54663: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882382.54668: _low_level_execute_command(): starting 12081 1726882382.54671: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/AnsiballZ_stat.py && sleep 0' 12081 1726882382.55919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.55923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.55959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882382.55962: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.55966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.56022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.56333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.56336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.56678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.58438: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12081 1726882382.58443: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12081 1726882382.58489: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12081 1726882382.58526: stdout chunk (state=3): >>>import 'posix' # <<< 12081 1726882382.58562: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12081 1726882382.58570: stdout chunk (state=3): >>># installing zipimport hook <<< 12081 1726882382.58596: stdout chunk (state=3): >>>import 'time' # <<< 12081 1726882382.58599: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12081 1726882382.58653: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.58703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 12081 1726882382.58706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 12081 1726882382.58709: stdout chunk (state=3): >>>import '_codecs' # <<< 12081 1726882382.58720: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243dc0> <<< 12081 1726882382.58768: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12081 1726882382.58772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 12081 1726882382.58808: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243b20> <<< 12081 1726882382.58818: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12081 1726882382.58834: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243ac0> import '_signal' # <<< 12081 1726882382.58869: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12081 1726882382.58940: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8490> <<< 12081 1726882382.58944: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 12081 1726882382.58946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12081 1726882382.58952: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 12081 1726882382.58955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882382.58957: stdout chunk (state=3): >>>import '_abc' # <<< 12081 1726882382.58971: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8670> <<< 12081 1726882382.59001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12081 1726882382.59004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12081 1726882382.59030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12081 1726882382.59047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12081 1726882382.59079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12081 1726882382.59086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12081 1726882382.59102: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f190> <<< 12081 1726882382.59126: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 12081 1726882382.59139: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12081 1726882382.59225: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f220> <<< 12081 1726882382.59254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 12081 1726882382.59260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12081 1726882382.59285: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f940> <<< 12081 1726882382.59315: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ff0880> <<< 12081 1726882382.59339: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 12081 1726882382.59342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f88d90> <<< 12081 1726882382.59402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fb2d90> <<< 12081 1726882382.59455: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f14c3fd8970> <<< 12081 1726882382.59478: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12081 1726882382.59679: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12081 1726882382.59719: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12081 1726882382.59744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12081 1726882382.59747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12081 1726882382.59786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 12081 1726882382.59789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12081 1726882382.59800: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2eeb0> <<< 12081 1726882382.59853: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f31f40> <<< 12081 1726882382.59872: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12081 1726882382.59890: stdout chunk (state=3): >>>import '_sre' # <<< 12081 1726882382.59909: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 12081 
1726882382.59923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12081 1726882382.59946: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12081 1726882382.59972: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f27610> <<< 12081 1726882382.59984: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2e370> <<< 12081 1726882382.60010: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12081 1726882382.60087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 12081 1726882382.60108: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 12081 1726882382.60138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.60156: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12081 1726882382.60203: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.60207: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3eafdf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eaf8e0> import 'itertools' # <<< 12081 1726882382.60241: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 12081 1726882382.60245: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eafee0> <<< 12081 1726882382.60252: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 12081 1726882382.60315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12081 1726882382.60318: stdout chunk (state=3): >>>import '_operator' # <<< 12081 1726882382.60323: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eaffa0> <<< 12081 1726882382.60325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 12081 1726882382.60336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eafeb0> import '_collections' # <<< 12081 1726882382.60391: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f09d60> <<< 12081 1726882382.60396: stdout chunk (state=3): >>>import '_functools' # <<< 12081 1726882382.60412: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f02640> <<< 12081 1726882382.60481: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 
12081 1726882382.60488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f156a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f35e20> <<< 12081 1726882382.60504: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 12081 1726882382.60529: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.60539: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3ec2ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f09280> <<< 12081 1726882382.60563: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.60602: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3f152b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f3b9d0> <<< 12081 1726882382.60606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 12081 1726882382.60635: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object 
from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.60667: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 12081 1726882382.60671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 12081 1726882382.60682: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2dc0> <<< 12081 1726882382.60711: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 12081 1726882382.60734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 12081 1726882382.60751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 12081 1726882382.60772: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882382.60789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12081 1726882382.60843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 12081 1726882382.60874: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 12081 1726882382.60892: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e953a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 12081 1726882382.60904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12081 1726882382.60930: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e95490> <<< 12081 1726882382.61060: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec9fd0> <<< 12081 1726882382.61098: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4a60> <<< 12081 1726882382.61113: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4580> <<< 12081 1726882382.61137: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12081 1726882382.61168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 12081 1726882382.61189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 12081 1726882382.61209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 12081 1726882382.61221: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dbf1f0> <<< 12081 1726882382.61251: stdout chunk 
(state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e80b80> <<< 12081 1726882382.61299: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4ee0> <<< 12081 1726882382.61318: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f3b040> <<< 12081 1726882382.61333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 12081 1726882382.61354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 12081 1726882382.61379: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dd0b20> <<< 12081 1726882382.61392: stdout chunk (state=3): >>>import 'errno' # <<< 12081 1726882382.61424: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.61443: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3dd0e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 12081 1726882382.61469: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 12081 1726882382.61489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de3760> <<< 12081 1726882382.61505: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 12081 1726882382.61534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 12081 1726882382.61566: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de3ca0> <<< 12081 1726882382.61602: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d6f3d0> <<< 12081 1726882382.61622: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dd0f40> <<< 12081 1726882382.61638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 12081 1726882382.61683: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.61699: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d802b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de35e0> import 'pwd' # <<< 12081 1726882382.61721: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.61732: stdout chunk (state=3): >>># extension module 
'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d80370> <<< 12081 1726882382.61762: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2a00> <<< 12081 1726882382.61783: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 12081 1726882382.61803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 12081 1726882382.61823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 12081 1726882382.61836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 12081 1726882382.61872: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b6d0> <<< 12081 1726882382.61888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 12081 1726882382.61926: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b9a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9b790> <<< 12081 1726882382.61944: stdout chunk 
(state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b880> <<< 12081 1726882382.62033: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12081 1726882382.62173: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9bcd0> <<< 12081 1726882382.62204: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3da9220> <<< 12081 1726882382.62222: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9b910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d8fa60> <<< 12081 1726882382.62247: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec25e0> <<< 12081 1726882382.62268: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 12081 1726882382.62330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' 
<<< 12081 1726882382.62360: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9bac0> <<< 12081 1726882382.62467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12081 1726882382.62479: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f14c3cc56a0> <<< 12081 1726882382.62638: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip' # zipimport: zlib available <<< 12081 1726882382.62729: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.62772: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/__init__.py <<< 12081 1726882382.62777: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.62780: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.62795: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 12081 1726882382.64017: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.65016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb7f0> <<< 12081 1726882382.65055: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.65061: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12081 1726882382.65081: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12081 1726882382.65115: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.65118: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3beb160> <<< 12081 1726882382.65155: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb280> <<< 12081 1726882382.65196: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bebf40> <<< 12081 1726882382.65208: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12081 1726882382.65268: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bebd60> <<< 12081 1726882382.65276: stdout chunk (state=3): >>>import 'atexit' # <<< 12081 1726882382.65307: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 
12081 1726882382.65310: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bebfa0> <<< 12081 1726882382.65312: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12081 1726882382.65337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12081 1726882382.65386: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb100> <<< 12081 1726882382.65406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12081 1726882382.65413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12081 1726882382.65434: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12081 1726882382.65447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12081 1726882382.65480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12081 1726882382.65549: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35edf10> <<< 12081 1726882382.65598: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.65604: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f14c3b61f10> <<< 12081 1726882382.65628: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3b61d30> <<< 12081 1726882382.65633: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 12081 1726882382.65662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12081 1726882382.65700: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3b613a0> <<< 12081 1726882382.65711: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c51dc0> <<< 12081 1726882382.65886: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c513a0> <<< 12081 1726882382.65910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12081 1726882382.65941: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c51fa0> <<< 12081 1726882382.65948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 12081 1726882382.65991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12081 1726882382.65998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from 
'/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12081 1726882382.66001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 12081 1726882382.66036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12081 1726882382.66039: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 12081 1726882382.66043: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c22c70> <<< 12081 1726882382.66132: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbdd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbd3d0> <<< 12081 1726882382.66138: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bf44c0> <<< 12081 1726882382.66167: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bbd4f0> <<< 12081 1726882382.66196: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.66199: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbd520> <<< 12081 1726882382.66229: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 12081 1726882382.66232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12081 1726882382.66242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 12081 1726882382.66276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12081 1726882382.66346: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.66352: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35cf310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c33220> <<< 12081 1726882382.66380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 12081 1726882382.66383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12081 1726882382.66430: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35da880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c333a0> <<< 12081 1726882382.66453: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/logging/__init__.py <<< 12081 1726882382.66489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.66522: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 12081 1726882382.66525: stdout chunk (state=3): >>>import '_string' # <<< 12081 1726882382.66583: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c4bdc0> <<< 12081 1726882382.66721: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35da820> <<< 12081 1726882382.66812: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35da670> <<< 12081 1726882382.66855: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.66859: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35d9610> <<< 12081 1726882382.66889: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.66896: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35d9520> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c2a8e0> <<< 12081 1726882382.66917: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 12081 1726882382.66942: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 12081 1726882382.66945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12081 1726882382.66996: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb46a0> <<< 12081 1726882382.67194: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.67202: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb3af0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bc30a0> <<< 12081 1726882382.67235: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb4100> <<< 12081 1726882382.67239: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bf7ac0> # zipimport: zlib available <<< 12081 1726882382.67271: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.67275: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 12081 1726882382.67290: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.67352: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.67441: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.67444: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 12081 1726882382.67471: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12081 1726882382.67489: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 12081 1726882382.67593: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.67688: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.68135: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.68598: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 
12081 1726882382.68610: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 12081 1726882382.68622: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 12081 1726882382.68625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.68687: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c31da5b0> <<< 12081 1726882382.68744: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 12081 1726882382.68767: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35ab550> <<< 12081 1726882382.68772: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c317a0d0> <<< 12081 1726882382.68821: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 12081 1726882382.68829: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.68846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.68867: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded 
from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 12081 1726882382.68872: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.68995: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.69120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 12081 1726882382.69126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12081 1726882382.69151: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bb3be0> <<< 12081 1726882382.69170: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.69537: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.69946: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.69958: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70022: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 12081 1726882382.70025: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70107: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70111: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 12081 1726882382.70114: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70158: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70527: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 12081 1726882382.70852: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.70858: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 12081 1726882382.70861: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35b79a0> <<< 12081 1726882382.70865: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71224: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71228: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 12081 1726882382.71230: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 12081 1726882382.71243: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12081 1726882382.71245: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12081 1726882382.71248: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71251: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 12081 1726882382.71256: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71258: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71260: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71262: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71395: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 12081 1726882382.71407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12081 1726882382.71410: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 12081 1726882382.71413: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3c3e250> <<< 12081 1726882382.71415: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35b7f10> <<< 12081 1726882382.71453: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 12081 
1726882382.71457: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71575: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71635: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71639: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.71746: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12081 1726882382.71799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12081 1726882382.71903: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c359d7f0> <<< 12081 1726882382.71968: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3599820> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3592a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 12081 1726882382.71974: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.72162: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 12081 1726882382.72252: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.72425: stdout chunk (state=3): >>># zipimport: zlib available <<< 12081 1726882382.72567: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 12081 1726882382.72832: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 12081 1726882382.72866: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing 
encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util <<< 12081 1726882382.72904: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 12081 1726882382.72913: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing 
array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 12081 1726882382.72918: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12081 1726882382.73116: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12081 1726882382.73135: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12081 1726882382.73151: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 12081 1726882382.73187: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 12081 1726882382.73200: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 12081 1726882382.73204: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 12081 1726882382.73206: stdout chunk (state=3): >>># 
destroy syslog # destroy uuid <<< 12081 1726882382.73223: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 12081 1726882382.73229: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 12081 1726882382.73299: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 12081 1726882382.73322: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 12081 1726882382.73347: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 12081 1726882382.73356: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select <<< 12081 1726882382.73372: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 12081 1726882382.73384: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 12081 1726882382.73399: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 12081 1726882382.73427: stdout chunk (state=3): >>># 
cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 12081 1726882382.73442: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 12081 1726882382.73456: stdout chunk (state=3): >>># destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 12081 1726882382.73483: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 12081 1726882382.73486: stdout chunk (state=3): >>># cleanup[3] wiping os.path <<< 12081 1726882382.73488: stdout chunk (state=3): >>># destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat <<< 12081 1726882382.73491: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 12081 1726882382.73493: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 12081 1726882382.73499: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12081 1726882382.73501: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket <<< 
12081 1726882382.73503: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 12081 1726882382.73656: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 12081 1726882382.73670: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 12081 1726882382.73674: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 12081 1726882382.73695: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 12081 1726882382.73704: stdout chunk (state=3): >>># destroy select <<< 12081 1726882382.73710: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 12081 1726882382.73743: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 12081 1726882382.74136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882382.74139: stdout chunk (state=3): >>><<< 12081 1726882382.74142: stderr chunk (state=3): >>><<< 12081 1726882382.74246: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c4243ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f8f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ff0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f88d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fb2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3fd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f31f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py
# code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py
# code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc'
import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f27610>
import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2d640>
import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f2e370>
# /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py
# code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'
# /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py
# code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py
# code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc'
# extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'
# extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'
import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3eafdf0>
import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eaf8e0>
import 'itertools' #
# /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py
# code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc'
import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eafee0>
# /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py
# code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc'
import '_operator' #
import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eaffa0>
# /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py
# code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc'
import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3eafeb0>
import '_collections' #
import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f09d60>
import '_functools' #
import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f02640>
# /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py
# code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc'
import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f156a0>
import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f35e20>
# /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py
# code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc'
# extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'
# extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'
import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3ec2ca0>
import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f09280>
# extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so'
# extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so'
import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3f152b0>
import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f3b9d0>
# /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py
# code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc'
# /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py
# code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py
# code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc'
import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2fd0>
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2dc0>
# /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py
# code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc'
import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2d30>
# /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py
# code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc'
# /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py
# code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py
# code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc'
# /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py
# code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc'
import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e953a0>
# /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py
# code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc'
import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e95490>
import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec9fd0>
import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4a60>
import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4580>
# /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py
# code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py
# code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py
# code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc'
import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dbf1f0>
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3e80b80>
import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec4ee0>
import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3f3b040>
# /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py
# code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py
# code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc'
import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dd0b20>
import 'errno' #
# extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so'
# extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so'
import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3dd0e50>
# /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py
# code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py
# code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc'
import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de3760>
# /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py
# code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc'
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de3ca0>
# extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so'
# extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so'
import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d6f3d0>
import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3dd0f40>
# /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py
# code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc'
# extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'
# extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'
import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d802b0>
import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3de35e0>
import 'pwd' #
# extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'
# extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'
import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d80370>
import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec2a00>
# /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py
# code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py
# code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc'
# extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so'
# extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so'
import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b6d0>
# /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py
# code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc'
# extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so'
# extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so'
import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b9a0>
import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9b790>
# extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so'
# extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so'
import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9b880>
# /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py
# code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc'
# extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'
# extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'
import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3d9bcd0>
# extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'
# extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'
import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3da9220>
import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9b910>
import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d8fa60>
import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3ec25e0>
# /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py
# code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc'
import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3d9bac0>
# code object from '/usr/lib64/python3.9/encodings/cp437.pyc'
import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f14c3cc56a0>
# zipimport: found 30 names in '/tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip'
# zipimport: zlib available
# zipimport: zlib available
import ansible # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/__init__.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/__init__.py
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py
# code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc'
import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb7f0>
# /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py
# code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc'
# /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py
# code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc'
# /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py
# code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc'
# extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so'
# extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so'
import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3beb160>
import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb280>
import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bebf40>
# /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py
# code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc'
import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb4f0>
import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bebd60>
import 'atexit' #
# extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'
# extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'
import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bebfa0>
# /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py
# code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc'
import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3beb100>
# /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py
# code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py
# code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py
# code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc'
import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35edf10>
# extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so'
# extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so'
import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3b61f10>
# extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'
# extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'
import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3b61d30>
# /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py
# code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc'
import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3b613a0>
import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c51dc0>
import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c513a0>
# /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py
# code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc'
import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c51fa0>
# /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py
# code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py
# code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py
# code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py
# code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc'
import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c22c70>
import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbdd00>
import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbd3d0>
import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bf44c0>
# extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'
# extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'
import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bbd4f0>
# /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py
# code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc'
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bbd520>
# /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py
# code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py
# code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc'
# extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'
# extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'
import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35cf310>
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c33220>
# /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py
# code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc'
# extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so'
# extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so'
import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35da880>
import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c333a0>
# /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py
# code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py
# code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc'
import '_string' #
import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c4bdc0>
import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35da820>
# extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so'
# extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so'
import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35da670>
# extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so'
# extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so'
import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35d9610>
# extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so'
# extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so'
import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c35d9520>
import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3c2a8e0>
# /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py
# code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py
# code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc'
# extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'
# extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'
import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb46a0>
# extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so'
# extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so'
import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb3af0>
import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bc30a0>
# extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'
# extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'
import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3bb4100>
import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bf7ac0>
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py
import 'ansible.module_utils.six.moves' #
import 'ansible.module_utils.six.moves.collections_abc' #
import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py
# /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py
# code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'
# extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'
# extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'
import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c31da5b0>
# /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py
# code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc'
import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35ab550>
import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c317a0d0>
import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/_text.py
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py
# code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'
import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3bb3be0>
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/collections.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/errors.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py
# code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'
import '_ast' #
import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35b79a0>
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py
import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/validation.py
import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py
import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/locale.py
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py
# code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'
# extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'
# extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14c3c3e250>
import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c35b7f10>
import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/file.py
import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/process.py
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py
# code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py
# code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py
# code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'
import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c359d7f0>
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3599820>
import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14c3592a00>
# destroy ansible.module_utils.distro
import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py
import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py
import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/module_utils/basic.py
# zipimport: zlib available
# zipimport: zlib available
import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_b56fnzb8/ansible_stat_payload.zip/ansible/modules/__init__.py
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
# destroy __main__
# clear builtins._
# clear sys.path
# clear sys.argv
# clear sys.ps1
# clear sys.ps2
# clear sys.last_type
# clear sys.last_value
# clear sys.last_traceback
# clear sys.path_hooks
# clear sys.path_importer_cache
# clear sys.meta_path
# clear sys.__interactivehook__
# restore sys.stdin
# restore sys.stdout
# restore sys.stderr
# cleanup[2] removing sys
# cleanup[2] removing builtins
# cleanup[2] removing _frozen_importlib
# cleanup[2] removing _imp
# cleanup[2] removing _thread
# cleanup[2] removing _warnings
# cleanup[2] removing _weakref
# cleanup[2] removing _io
# cleanup[2] removing marshal
# cleanup[2] removing posix
# cleanup[2] removing _frozen_importlib_external
# cleanup[2] removing time
# cleanup[2] removing zipimport
# cleanup[2] removing _codecs
# cleanup[2] removing codecs
# cleanup[2] removing encodings.aliases
# cleanup[2] removing encodings
# cleanup[2] removing encodings.utf_8
# cleanup[2] removing _signal
# cleanup[2] removing encodings.latin_1
# cleanup[2] removing _abc
# cleanup[2] removing abc
# cleanup[2] removing io
# cleanup[2] removing __main__
# cleanup[2] removing _stat
# cleanup[2] removing stat
# cleanup[2] removing _collections_abc
# cleanup[2] removing genericpath
# cleanup[2] removing posixpath
# cleanup[2] removing os.path
# cleanup[2] removing os
# cleanup[2] removing _sitebuiltins
# cleanup[2] removing _locale
# cleanup[2] removing _bootlocale
# destroy _bootlocale
# cleanup[2] removing site
# destroy site
# cleanup[2] removing types
# cleanup[2] removing enum
# cleanup[2] removing _sre
# cleanup[2] removing sre_constants
# destroy sre_constants
# cleanup[2] removing sre_parse
# cleanup[2] removing sre_compile
# cleanup[2] removing _heapq
# cleanup[2] removing heapq
# cleanup[2] removing itertools
# cleanup[2] removing keyword
# destroy keyword
# cleanup[2] removing _operator
# cleanup[2] removing operator
# cleanup[2] removing reprlib
# destroy reprlib
# cleanup[2] removing _collections
# cleanup[2] removing collections
# cleanup[2] removing _functools
# cleanup[2] removing functools
# cleanup[2] removing copyreg
# cleanup[2] removing re
# cleanup[2] removing _struct
# cleanup[2] removing struct
# cleanup[2] removing binascii
# cleanup[2] removing base64
# destroy base64
# cleanup[2] removing importlib._bootstrap
# cleanup[2] removing importlib._bootstrap_external
# cleanup[2] removing warnings
# cleanup[2] removing importlib
# cleanup[2] removing importlib.machinery
# cleanup[2] removing collections.abc
# cleanup[2] removing contextlib
# cleanup[2] removing typing
# destroy typing
# cleanup[2] removing importlib.abc
# cleanup[2] removing importlib.util
# cleanup[2] removing _weakrefset
# destroy _weakrefset
# cleanup[2] removing weakref
# cleanup[2] removing pkgutil
# destroy pkgutil
# cleanup[2] removing runpy
# destroy runpy
# cleanup[2] removing fnmatch
# cleanup[2] removing errno
# cleanup[2] removing zlib
# cleanup[2] removing _compression
# cleanup[2] removing threading
# cleanup[2] removing _bz2
# destroy _bz2
# cleanup[2] removing bz2
# cleanup[2] removing _lzma
# cleanup[2] removing lzma
# cleanup[2] removing pwd
# cleanup[2] removing grp
# cleanup[2] removing shutil
# cleanup[2] removing math
# cleanup[2] removing _bisect
# cleanup[2] removing bisect
# destroy bisect
# cleanup[2] removing _random
# cleanup[2] removing _hashlib
# cleanup[2] removing _blake2
# cleanup[2] removing hashlib
# cleanup[2] removing random
# destroy random
# cleanup[2] removing tempfile
# cleanup[2] removing zipfile
# destroy zipfile
# cleanup[2] removing encodings.cp437
# cleanup[2] removing ansible
# destroy ansible
# cleanup[2] removing ansible.module_utils
# destroy ansible.module_utils
# cleanup[2] removing __future__
# destroy __future__
# cleanup[2] removing _json
# cleanup[2] removing json.scanner
# cleanup[2] removing json.decoder
# cleanup[2] removing json.encoder
# cleanup[2] removing json
# cleanup[2] removing atexit
# cleanup[2] removing fcntl
# cleanup[2] removing locale
# cleanup[2] removing signal
# cleanup[2] removing _posixsubprocess
# cleanup[2] removing select
# cleanup[2] removing selectors
# cleanup[2] removing subprocess
# cleanup[2] removing platform
# cleanup[2] removing shlex
# cleanup[2] removing token
# destroy token
# cleanup[2] removing tokenize
# cleanup[2] removing linecache
# cleanup[2] removing traceback
# cleanup[2] removing syslog
# cleanup[2] removing systemd
# destroy systemd
# cleanup[2] removing _datetime
# cleanup[2] removing datetime
# cleanup[2] removing _uuid
# cleanup[2] removing uuid
# cleanup[2] removing _string
# cleanup[2] removing string
# destroy string
# cleanup[2] removing logging
# cleanup[2] removing systemd._journal
# cleanup[2] removing systemd._reader
# cleanup[2] removing systemd.id128
# cleanup[2] removing systemd.journal
# cleanup[2] removing _socket
# cleanup[2] removing array
# cleanup[2] removing socket
# destroy socket
# cleanup[2] removing systemd._daemon
# cleanup[2] removing systemd.daemon
# cleanup[2] removing ansible.module_utils.compat
# destroy ansible.module_utils.compat
# cleanup[2] removing ansible.module_utils.common
# destroy ansible.module_utils.common
# cleanup[2] removing ansible.module_utils.common.text
# destroy ansible.module_utils.common.text
# cleanup[2] removing ansible.module_utils.six
# destroy ansible.module_utils.six
# cleanup[2] removing ansible.module_utils.six.moves
# cleanup[2] removing ansible.module_utils.six.moves.collections_abc
# cleanup[2] removing ansible.module_utils.common.text.converters
# destroy ansible.module_utils.common.text.converters
# cleanup[2] removing _ctypes
# cleanup[2] removing ctypes._endian
# cleanup[2] removing ctypes
# destroy ctypes
# cleanup[2] removing ansible.module_utils.compat.selinux
# cleanup[2] removing ansible.module_utils._text
# destroy ansible.module_utils._text
# cleanup[2] removing copy
# destroy copy
# cleanup[2] removing ansible.module_utils.common.collections
# destroy ansible.module_utils.common.collections
# cleanup[2] removing ansible.module_utils.common.warnings
# destroy ansible.module_utils.common.warnings
# cleanup[2] removing ansible.module_utils.errors
# destroy ansible.module_utils.errors
# cleanup[2] removing ansible.module_utils.parsing
# destroy ansible.module_utils.parsing
# cleanup[2] removing ansible.module_utils.parsing.convert_bool
# destroy ansible.module_utils.parsing.convert_bool
# cleanup[2] removing _ast
# destroy _ast
# cleanup[2] removing ast
# destroy ast
# cleanup[2] removing ansible.module_utils.common.text.formatters
# destroy ansible.module_utils.common.text.formatters
# cleanup[2]
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # clear sys.audit hooks 12081 1726882382.75122: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh',
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882382.75125: _low_level_execute_command(): starting 12081 1726882382.75127: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882382.2926514-12190-157183584159415/ > /dev/null 2>&1 && sleep 0' 12081 1726882382.75893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882382.75897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.75899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.75902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.76027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.76030: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882382.76033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.76035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882382.76038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882382.76040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882382.76042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882382.76044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882382.76061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882382.76108: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882382.76116: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882382.76125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882382.76204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882382.76218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882382.76232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882382.76388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882382.78205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882382.78258: stderr chunk (state=3): >>><<< 12081 1726882382.78261: stdout chunk (state=3): >>><<< 12081 1726882382.78297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882382.78301: handler run complete 12081 1726882382.78303: attempt loop complete, returning result 12081 1726882382.78305: _execute() done 12081 1726882382.78307: dumping result to json 12081 1726882382.78309: done dumping result, returning 12081 1726882382.78312: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000028] 12081 1726882382.78314: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000028 12081 1726882382.78401: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000028 12081 1726882382.78404: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12081 1726882382.78482: no more pending results, returning what we have 12081 1726882382.78485: results queue empty 12081 1726882382.78486: checking for any_errors_fatal 12081 1726882382.78493: done checking for any_errors_fatal 12081 1726882382.78494: checking for max_fail_percentage 12081 1726882382.78496: done checking for max_fail_percentage 12081 1726882382.78496: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.78497: done checking to see if all hosts have failed 12081 1726882382.78498: getting the remaining hosts for this loop 12081 1726882382.78500: done getting the remaining hosts for this loop 12081 1726882382.78503: getting the next task for host managed_node3 12081 1726882382.78508: done getting next task for host managed_node3 12081 1726882382.78510: ^ task is: TASK: Set flag to indicate system is ostree 12081 1726882382.78515: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882382.78518: getting variables 12081 1726882382.78519: in VariableManager get_vars() 12081 1726882382.78548: Calling all_inventory to load vars for managed_node3 12081 1726882382.78553: Calling groups_inventory to load vars for managed_node3 12081 1726882382.78556: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.78566: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.78569: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.78571: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.78700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.78838: done with get_vars() 12081 1726882382.78869: done getting variables 12081 1726882382.78943: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:33:02 -0400 (0:00:00.568) 0:00:02.592 ****** 12081 1726882382.78990: entering _queue_task() for managed_node3/set_fact 12081 1726882382.78991: Creating lock for set_fact 12081 1726882382.79207: worker is 1 (out of 1 available) 12081 
1726882382.79217: exiting _queue_task() for managed_node3/set_fact 12081 1726882382.79227: done queuing things up, now waiting for results queue to drain 12081 1726882382.79228: waiting for pending results... 12081 1726882382.79468: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 12081 1726882382.79576: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000029 12081 1726882382.79601: variable 'ansible_search_path' from source: unknown 12081 1726882382.79608: variable 'ansible_search_path' from source: unknown 12081 1726882382.79655: calling self._execute() 12081 1726882382.79740: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.79756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.79771: variable 'omit' from source: magic vars 12081 1726882382.80260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882382.80515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882382.80576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882382.80618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882382.80670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882382.80778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882382.80810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882382.80839: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882382.80881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882382.81014: Evaluated conditional (not __network_is_ostree is defined): True 12081 1726882382.81025: variable 'omit' from source: magic vars 12081 1726882382.81072: variable 'omit' from source: magic vars 12081 1726882382.81213: variable '__ostree_booted_stat' from source: set_fact 12081 1726882382.81273: variable 'omit' from source: magic vars 12081 1726882382.81310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882382.81344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882382.81371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882382.81395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882382.81416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882382.81452: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882382.81463: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.81478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.81587: Set connection var ansible_pipelining to False 12081 1726882382.81590: Set connection var ansible_shell_type to sh 12081 1726882382.81595: Set connection var ansible_shell_executable to /bin/sh 12081 1726882382.81598: Set connection var 
ansible_connection to ssh 12081 1726882382.81603: Set connection var ansible_timeout to 10 12081 1726882382.81611: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882382.81645: variable 'ansible_shell_executable' from source: unknown 12081 1726882382.81658: variable 'ansible_connection' from source: unknown 12081 1726882382.81661: variable 'ansible_module_compression' from source: unknown 12081 1726882382.81665: variable 'ansible_shell_type' from source: unknown 12081 1726882382.81668: variable 'ansible_shell_executable' from source: unknown 12081 1726882382.81670: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.81674: variable 'ansible_pipelining' from source: unknown 12081 1726882382.81676: variable 'ansible_timeout' from source: unknown 12081 1726882382.81680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.81753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882382.81766: variable 'omit' from source: magic vars 12081 1726882382.81774: starting attempt loop 12081 1726882382.81779: running the handler 12081 1726882382.81793: handler run complete 12081 1726882382.81802: attempt loop complete, returning result 12081 1726882382.81805: _execute() done 12081 1726882382.81808: dumping result to json 12081 1726882382.81810: done dumping result, returning 12081 1726882382.81816: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000029] 12081 1726882382.81821: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000029 ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 12081 
1726882382.81948: no more pending results, returning what we have 12081 1726882382.81953: results queue empty 12081 1726882382.81954: checking for any_errors_fatal 12081 1726882382.81962: done checking for any_errors_fatal 12081 1726882382.81963: checking for max_fail_percentage 12081 1726882382.81969: done checking for max_fail_percentage 12081 1726882382.81970: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.81971: done checking to see if all hosts have failed 12081 1726882382.81971: getting the remaining hosts for this loop 12081 1726882382.81973: done getting the remaining hosts for this loop 12081 1726882382.81977: getting the next task for host managed_node3 12081 1726882382.81984: done getting next task for host managed_node3 12081 1726882382.81986: ^ task is: TASK: Fix CentOS6 Base repo 12081 1726882382.81989: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.81992: getting variables 12081 1726882382.81994: in VariableManager get_vars() 12081 1726882382.82023: Calling all_inventory to load vars for managed_node3 12081 1726882382.82025: Calling groups_inventory to load vars for managed_node3 12081 1726882382.82028: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.82040: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.82043: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.82046: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.82210: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000029 12081 1726882382.82220: WORKER PROCESS EXITING 12081 1726882382.82230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.82346: done with get_vars() 12081 1726882382.82356: done getting variables 12081 1726882382.82442: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:33:02 -0400 (0:00:00.034) 0:00:02.627 ****** 12081 1726882382.82467: entering _queue_task() for managed_node3/copy 12081 1726882382.82649: worker is 1 (out of 1 available) 12081 1726882382.82663: exiting _queue_task() for managed_node3/copy 12081 1726882382.82674: done queuing things up, now waiting for results queue to drain 12081 1726882382.82675: waiting for pending results... 
12081 1726882382.82812: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 12081 1726882382.82866: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000002b 12081 1726882382.82879: variable 'ansible_search_path' from source: unknown 12081 1726882382.82884: variable 'ansible_search_path' from source: unknown 12081 1726882382.82914: calling self._execute() 12081 1726882382.82966: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.82970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.82978: variable 'omit' from source: magic vars 12081 1726882382.83295: variable 'ansible_distribution' from source: facts 12081 1726882382.83310: Evaluated conditional (ansible_distribution == 'CentOS'): True 12081 1726882382.83437: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.83448: Evaluated conditional (ansible_distribution_major_version == '6'): False 12081 1726882382.83456: when evaluation is False, skipping this task 12081 1726882382.83466: _execute() done 12081 1726882382.83478: dumping result to json 12081 1726882382.83486: done dumping result, returning 12081 1726882382.83496: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-0a3f-ff3c-00000000002b] 12081 1726882382.83507: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000002b skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12081 1726882382.83734: no more pending results, returning what we have 12081 1726882382.83737: results queue empty 12081 1726882382.83738: checking for any_errors_fatal 12081 1726882382.83742: done checking for any_errors_fatal 12081 1726882382.83743: checking for max_fail_percentage 12081 1726882382.83744: done checking for max_fail_percentage 12081 1726882382.83746: checking to see if all hosts have failed and the 
running result is not ok 12081 1726882382.83747: done checking to see if all hosts have failed 12081 1726882382.83748: getting the remaining hosts for this loop 12081 1726882382.83753: done getting the remaining hosts for this loop 12081 1726882382.83757: getting the next task for host managed_node3 12081 1726882382.83766: done getting next task for host managed_node3 12081 1726882382.83770: ^ task is: TASK: Include the task 'enable_epel.yml' 12081 1726882382.83773: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.83777: getting variables 12081 1726882382.83779: in VariableManager get_vars() 12081 1726882382.83805: Calling all_inventory to load vars for managed_node3 12081 1726882382.83808: Calling groups_inventory to load vars for managed_node3 12081 1726882382.83811: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.83942: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000002b 12081 1726882382.83945: WORKER PROCESS EXITING 12081 1726882382.83957: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.83961: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.83965: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.84178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.84383: done with get_vars() 12081 1726882382.84396: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:33:02 -0400 (0:00:00.020) 0:00:02.648 ****** 12081 1726882382.84506: entering _queue_task() for managed_node3/include_tasks 12081 1726882382.84711: worker is 1 (out of 1 available) 12081 1726882382.84727: exiting _queue_task() for managed_node3/include_tasks 12081 1726882382.84737: done queuing things up, now waiting for results queue to drain 12081 1726882382.84738: waiting for pending results... 
12081 1726882382.84871: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 12081 1726882382.84932: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000002c 12081 1726882382.84942: variable 'ansible_search_path' from source: unknown 12081 1726882382.84945: variable 'ansible_search_path' from source: unknown 12081 1726882382.84977: calling self._execute() 12081 1726882382.85027: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.85031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.85039: variable 'omit' from source: magic vars 12081 1726882382.85426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882382.86942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882382.86996: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882382.87024: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882382.87049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882382.87074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882382.87136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882382.87158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882382.87177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882382.87205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882382.87215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882382.87300: variable '__network_is_ostree' from source: set_fact 12081 1726882382.87314: Evaluated conditional (not __network_is_ostree | d(false)): True 12081 1726882382.87318: _execute() done 12081 1726882382.87321: dumping result to json 12081 1726882382.87324: done dumping result, returning 12081 1726882382.87331: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-0a3f-ff3c-00000000002c] 12081 1726882382.87336: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000002c 12081 1726882382.87421: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000002c 12081 1726882382.87424: WORKER PROCESS EXITING 12081 1726882382.87478: no more pending results, returning what we have 12081 1726882382.87483: in VariableManager get_vars() 12081 1726882382.87514: Calling all_inventory to load vars for managed_node3 12081 1726882382.87517: Calling groups_inventory to load vars for managed_node3 12081 1726882382.87520: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.87530: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.87537: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.87541: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.87695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 12081 1726882382.87802: done with get_vars() 12081 1726882382.87807: variable 'ansible_search_path' from source: unknown 12081 1726882382.87808: variable 'ansible_search_path' from source: unknown 12081 1726882382.87832: we have included files to process 12081 1726882382.87833: generating all_blocks data 12081 1726882382.87833: done generating all_blocks data 12081 1726882382.87837: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12081 1726882382.87838: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12081 1726882382.87839: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12081 1726882382.88296: done processing included file 12081 1726882382.88298: iterating over new_blocks loaded from include file 12081 1726882382.88299: in VariableManager get_vars() 12081 1726882382.88308: done with get_vars() 12081 1726882382.88309: filtering new block on tags 12081 1726882382.88323: done filtering new block on tags 12081 1726882382.88325: in VariableManager get_vars() 12081 1726882382.88342: done with get_vars() 12081 1726882382.88344: filtering new block on tags 12081 1726882382.88351: done filtering new block on tags 12081 1726882382.88353: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 12081 1726882382.88356: extending task lists for all hosts with included blocks 12081 1726882382.88421: done extending task lists 12081 1726882382.88422: done processing included files 12081 1726882382.88422: results queue empty 12081 1726882382.88423: checking for any_errors_fatal 12081 1726882382.88425: done checking for any_errors_fatal 12081 1726882382.88425: checking for max_fail_percentage 12081 1726882382.88426: done 
checking for max_fail_percentage 12081 1726882382.88427: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.88427: done checking to see if all hosts have failed 12081 1726882382.88428: getting the remaining hosts for this loop 12081 1726882382.88428: done getting the remaining hosts for this loop 12081 1726882382.88430: getting the next task for host managed_node3 12081 1726882382.88433: done getting next task for host managed_node3 12081 1726882382.88434: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 12081 1726882382.88436: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.88437: getting variables 12081 1726882382.88438: in VariableManager get_vars() 12081 1726882382.88444: Calling all_inventory to load vars for managed_node3 12081 1726882382.88445: Calling groups_inventory to load vars for managed_node3 12081 1726882382.88446: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.88450: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.88455: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.88457: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.88537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.88643: done with get_vars() 12081 1726882382.88649: done getting variables 12081 1726882382.88695: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12081 1726882382.88824: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:33:02 -0400 (0:00:00.043) 0:00:02.691 ****** 12081 1726882382.88858: entering _queue_task() for managed_node3/command 12081 1726882382.88859: Creating lock for command 12081 1726882382.89049: worker is 1 (out of 1 available) 12081 1726882382.89063: exiting _queue_task() for managed_node3/command 12081 1726882382.89075: done queuing things up, now waiting for results queue to drain 12081 1726882382.89077: waiting for pending results... 
12081 1726882382.89221: running TaskExecutor() for managed_node3/TASK: Create EPEL 9 12081 1726882382.89288: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000046 12081 1726882382.89299: variable 'ansible_search_path' from source: unknown 12081 1726882382.89302: variable 'ansible_search_path' from source: unknown 12081 1726882382.89329: calling self._execute() 12081 1726882382.89382: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.89387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.89395: variable 'omit' from source: magic vars 12081 1726882382.89649: variable 'ansible_distribution' from source: facts 12081 1726882382.89663: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12081 1726882382.89787: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.89791: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12081 1726882382.89794: when evaluation is False, skipping this task 12081 1726882382.89797: _execute() done 12081 1726882382.89800: dumping result to json 12081 1726882382.89802: done dumping result, returning 12081 1726882382.89808: done running TaskExecutor() for managed_node3/TASK: Create EPEL 9 [0e448fcc-3ce9-0a3f-ff3c-000000000046] 12081 1726882382.89814: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000046 12081 1726882382.89906: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000046 12081 1726882382.89909: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12081 1726882382.89968: no more pending results, returning what we have 12081 1726882382.89971: results queue empty 12081 1726882382.89972: checking for any_errors_fatal 12081 1726882382.89973: done checking for any_errors_fatal 12081 1726882382.89974: checking for 
max_fail_percentage 12081 1726882382.89976: done checking for max_fail_percentage 12081 1726882382.89976: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.89977: done checking to see if all hosts have failed 12081 1726882382.89978: getting the remaining hosts for this loop 12081 1726882382.89979: done getting the remaining hosts for this loop 12081 1726882382.89982: getting the next task for host managed_node3 12081 1726882382.89987: done getting next task for host managed_node3 12081 1726882382.89989: ^ task is: TASK: Install yum-utils package 12081 1726882382.89992: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.89995: getting variables 12081 1726882382.89997: in VariableManager get_vars() 12081 1726882382.90020: Calling all_inventory to load vars for managed_node3 12081 1726882382.90022: Calling groups_inventory to load vars for managed_node3 12081 1726882382.90025: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.90034: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.90036: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.90038: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.90174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.90288: done with get_vars() 12081 1726882382.90295: done getting variables 12081 1726882382.90366: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:33:02 -0400 (0:00:00.015) 0:00:02.706 ****** 12081 1726882382.90388: entering _queue_task() for managed_node3/package 12081 1726882382.90389: Creating lock for package 12081 1726882382.90579: worker is 1 (out of 1 available) 12081 1726882382.90591: exiting _queue_task() for managed_node3/package 12081 1726882382.90603: done queuing things up, now waiting for results queue to drain 12081 1726882382.90604: waiting for pending results... 
12081 1726882382.90748: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 12081 1726882382.90813: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000047 12081 1726882382.90821: variable 'ansible_search_path' from source: unknown 12081 1726882382.90826: variable 'ansible_search_path' from source: unknown 12081 1726882382.90854: calling self._execute() 12081 1726882382.90910: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.90914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.90921: variable 'omit' from source: magic vars 12081 1726882382.91190: variable 'ansible_distribution' from source: facts 12081 1726882382.91202: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12081 1726882382.91292: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.91297: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12081 1726882382.91300: when evaluation is False, skipping this task 12081 1726882382.91303: _execute() done 12081 1726882382.91305: dumping result to json 12081 1726882382.91308: done dumping result, returning 12081 1726882382.91314: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0e448fcc-3ce9-0a3f-ff3c-000000000047] 12081 1726882382.91320: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000047 12081 1726882382.91402: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000047 12081 1726882382.91404: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12081 1726882382.91451: no more pending results, returning what we have 12081 1726882382.91454: results queue empty 12081 1726882382.91455: checking for any_errors_fatal 12081 1726882382.91467: done checking for any_errors_fatal 12081 
1726882382.91467: checking for max_fail_percentage 12081 1726882382.91469: done checking for max_fail_percentage 12081 1726882382.91470: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.91471: done checking to see if all hosts have failed 12081 1726882382.91471: getting the remaining hosts for this loop 12081 1726882382.91473: done getting the remaining hosts for this loop 12081 1726882382.91476: getting the next task for host managed_node3 12081 1726882382.91482: done getting next task for host managed_node3 12081 1726882382.91484: ^ task is: TASK: Enable EPEL 7 12081 1726882382.91488: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.91491: getting variables 12081 1726882382.91492: in VariableManager get_vars() 12081 1726882382.91522: Calling all_inventory to load vars for managed_node3 12081 1726882382.91524: Calling groups_inventory to load vars for managed_node3 12081 1726882382.91527: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.91538: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.91539: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.91541: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.91649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.91764: done with get_vars() 12081 1726882382.91771: done getting variables 12081 1726882382.91810: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:33:02 -0400 (0:00:00.014) 0:00:02.721 ****** 12081 1726882382.91830: entering _queue_task() for managed_node3/command 12081 1726882382.92007: worker is 1 (out of 1 available) 12081 1726882382.92020: exiting _queue_task() for managed_node3/command 12081 1726882382.92030: done queuing things up, now waiting for results queue to drain 12081 1726882382.92031: waiting for pending results... 
12081 1726882382.92177: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 12081 1726882382.92238: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000048 12081 1726882382.92247: variable 'ansible_search_path' from source: unknown 12081 1726882382.92253: variable 'ansible_search_path' from source: unknown 12081 1726882382.92289: calling self._execute() 12081 1726882382.92338: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.92341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.92350: variable 'omit' from source: magic vars 12081 1726882382.92660: variable 'ansible_distribution' from source: facts 12081 1726882382.92672: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12081 1726882382.92760: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.92766: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12081 1726882382.92769: when evaluation is False, skipping this task 12081 1726882382.92772: _execute() done 12081 1726882382.92775: dumping result to json 12081 1726882382.92778: done dumping result, returning 12081 1726882382.92784: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0e448fcc-3ce9-0a3f-ff3c-000000000048] 12081 1726882382.92789: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000048 12081 1726882382.92869: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000048 12081 1726882382.92872: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12081 1726882382.92944: no more pending results, returning what we have 12081 1726882382.92947: results queue empty 12081 1726882382.92948: checking for any_errors_fatal 12081 1726882382.92952: done checking for any_errors_fatal 12081 1726882382.92952: checking for 
max_fail_percentage 12081 1726882382.92953: done checking for max_fail_percentage 12081 1726882382.92954: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.92955: done checking to see if all hosts have failed 12081 1726882382.92956: getting the remaining hosts for this loop 12081 1726882382.92957: done getting the remaining hosts for this loop 12081 1726882382.92960: getting the next task for host managed_node3 12081 1726882382.92966: done getting next task for host managed_node3 12081 1726882382.92968: ^ task is: TASK: Enable EPEL 8 12081 1726882382.92972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.92975: getting variables 12081 1726882382.92976: in VariableManager get_vars() 12081 1726882382.93000: Calling all_inventory to load vars for managed_node3 12081 1726882382.93002: Calling groups_inventory to load vars for managed_node3 12081 1726882382.93004: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.93011: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.93013: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.93015: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.93147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.93260: done with get_vars() 12081 1726882382.93268: done getting variables 12081 1726882382.93307: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:33:02 -0400 (0:00:00.014) 0:00:02.736 ****** 12081 1726882382.93329: entering _queue_task() for managed_node3/command 12081 1726882382.93501: worker is 1 (out of 1 available) 12081 1726882382.93513: exiting _queue_task() for managed_node3/command 12081 1726882382.93524: done queuing things up, now waiting for results queue to drain 12081 1726882382.93525: waiting for pending results... 
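The "Enable EPEL 7" task above is skipped because its second `when` conditional evaluates False. The flow the log traces (evaluate each conditional, stop at the first False one, return a skipped result carrying `false_condition` and `skip_reason`) can be sketched roughly as follows. This is an illustration only, not Ansible's actual implementation; Ansible templates conditionals through Jinja2, whereas the sketch cheats with a restricted `eval()`:

```python
# Rough sketch (NOT Ansible's real code) of how a False `when`
# conditional yields the skipped-task result seen in the log.

def evaluate_conditionals(conditionals, facts):
    """Return (all_true, first_false_condition)."""
    for cond in conditionals:
        # Real Ansible renders each conditional with Jinja2; eval()
        # over a restricted namespace stands in for that here.
        if not eval(cond, {"__builtins__": {}}, facts):
            return False, cond
    return True, None

def run_task(conditionals, facts):
    ok, false_condition = evaluate_conditionals(conditionals, facts)
    if not ok:
        # "when evaluation is False, skipping this task"
        return {
            "changed": False,
            "skipped": True,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# Facts matching the run above: RedHat, but not major version 7/8.
facts = {
    "ansible_distribution": "RedHat",
    "ansible_distribution_major_version": "9",
}
result = run_task(
    ["ansible_distribution in ['RedHat', 'CentOS']",
     "ansible_distribution_major_version in ['7', '8']"],
    facts,
)
```

With these facts the first conditional passes and the second fails, reproducing the `skipping: [managed_node3]` result dict shown in the log.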
12081 1726882382.93669: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 12081 1726882382.93724: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000049 12081 1726882382.93735: variable 'ansible_search_path' from source: unknown 12081 1726882382.93737: variable 'ansible_search_path' from source: unknown 12081 1726882382.93771: calling self._execute() 12081 1726882382.93822: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.93826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.93833: variable 'omit' from source: magic vars 12081 1726882382.94095: variable 'ansible_distribution' from source: facts 12081 1726882382.94106: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12081 1726882382.94193: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.94200: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12081 1726882382.94203: when evaluation is False, skipping this task 12081 1726882382.94205: _execute() done 12081 1726882382.94208: dumping result to json 12081 1726882382.94210: done dumping result, returning 12081 1726882382.94216: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0e448fcc-3ce9-0a3f-ff3c-000000000049] 12081 1726882382.94222: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000049 12081 1726882382.94308: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000049 12081 1726882382.94310: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12081 1726882382.94361: no more pending results, returning what we have 12081 1726882382.94365: results queue empty 12081 1726882382.94366: checking for any_errors_fatal 12081 1726882382.94370: done checking for any_errors_fatal 12081 1726882382.94371: checking for 
max_fail_percentage 12081 1726882382.94372: done checking for max_fail_percentage 12081 1726882382.94373: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.94374: done checking to see if all hosts have failed 12081 1726882382.94374: getting the remaining hosts for this loop 12081 1726882382.94376: done getting the remaining hosts for this loop 12081 1726882382.94379: getting the next task for host managed_node3 12081 1726882382.94386: done getting next task for host managed_node3 12081 1726882382.94388: ^ task is: TASK: Enable EPEL 6 12081 1726882382.94391: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.94394: getting variables 12081 1726882382.94396: in VariableManager get_vars() 12081 1726882382.94418: Calling all_inventory to load vars for managed_node3 12081 1726882382.94425: Calling groups_inventory to load vars for managed_node3 12081 1726882382.94427: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.94434: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.94436: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.94438: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.94546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.94662: done with get_vars() 12081 1726882382.94670: done getting variables 12081 1726882382.94710: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:33:02 -0400 (0:00:00.013) 0:00:02.750 ****** 12081 1726882382.94729: entering _queue_task() for managed_node3/copy 12081 1726882382.94901: worker is 1 (out of 1 available) 12081 1726882382.94913: exiting _queue_task() for managed_node3/copy 12081 1726882382.94924: done queuing things up, now waiting for results queue to drain 12081 1726882382.94925: waiting for pending results... 
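The repeated "^ state is: HOST STATE: block=..., task=..." dumps above track where each host is in the play's block structure, with nested "tasks child state?" entries recursing into included blocks. A minimal dataclass mirroring those fields makes the nesting easier to read; field names are taken from the log text itself, and this is a reading aid, not Ansible's actual `HostState` class:

```python
# Illustrative model of the "HOST STATE: ..." dumps in the log.
# Field names follow the log output; NOT Ansible's real HostState API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    handlers: int = 0
    run_state: int = 1
    fail_state: int = 0
    update_handlers: bool = True
    pending_setup: bool = False
    # Nested child states are how block/rescue/always sections recurse.
    tasks_child_state: Optional["HostState"] = None
    rescue_child_state: Optional["HostState"] = None
    always_child_state: Optional["HostState"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# The state dumped before "TASK: Enable EPEL 6" above, as nested objects:
state = HostState(
    block=2, task=4,
    tasks_child_state=HostState(
        block=0, task=4,
        tasks_child_state=HostState(block=0, task=1),
    ),
)
```

Reading it this way, the host is on block 2 / task 4 at the top level, inside an included block at task 4, inside a further nested block at task 1.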
12081 1726882382.95061: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 12081 1726882382.95123: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000004b 12081 1726882382.95132: variable 'ansible_search_path' from source: unknown 12081 1726882382.95136: variable 'ansible_search_path' from source: unknown 12081 1726882382.95165: calling self._execute() 12081 1726882382.95216: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.95220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.95228: variable 'omit' from source: magic vars 12081 1726882382.95526: variable 'ansible_distribution' from source: facts 12081 1726882382.95537: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12081 1726882382.95616: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.95620: Evaluated conditional (ansible_distribution_major_version == '6'): False 12081 1726882382.95623: when evaluation is False, skipping this task 12081 1726882382.95625: _execute() done 12081 1726882382.95629: dumping result to json 12081 1726882382.95632: done dumping result, returning 12081 1726882382.95640: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0e448fcc-3ce9-0a3f-ff3c-00000000004b] 12081 1726882382.95643: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000004b 12081 1726882382.95728: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000004b 12081 1726882382.95731: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12081 1726882382.95783: no more pending results, returning what we have 12081 1726882382.95786: results queue empty 12081 1726882382.95787: checking for any_errors_fatal 12081 1726882382.95790: done checking for any_errors_fatal 12081 1726882382.95791: checking for max_fail_percentage 
12081 1726882382.95792: done checking for max_fail_percentage 12081 1726882382.95793: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.95794: done checking to see if all hosts have failed 12081 1726882382.95795: getting the remaining hosts for this loop 12081 1726882382.95796: done getting the remaining hosts for this loop 12081 1726882382.95799: getting the next task for host managed_node3 12081 1726882382.95806: done getting next task for host managed_node3 12081 1726882382.95808: ^ task is: TASK: Set network provider to 'nm' 12081 1726882382.95810: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882382.95813: getting variables 12081 1726882382.95814: in VariableManager get_vars() 12081 1726882382.95836: Calling all_inventory to load vars for managed_node3 12081 1726882382.95837: Calling groups_inventory to load vars for managed_node3 12081 1726882382.95839: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.95846: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.95848: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.95857: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.95991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.96104: done with get_vars() 12081 1726882382.96110: done getting variables 12081 1726882382.96148: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Friday 20 September 2024 21:33:02 -0400 (0:00:00.014) 0:00:02.764 ****** 12081 1726882382.96170: entering _queue_task() for managed_node3/set_fact 12081 1726882382.96337: worker is 1 (out of 1 available) 12081 1726882382.96352: exiting _queue_task() for managed_node3/set_fact 12081 1726882382.96363: done queuing things up, now waiting for results queue to drain 12081 1726882382.96366: waiting for pending results... 12081 1726882382.96495: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 12081 1726882382.96544: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000007 12081 1726882382.96555: variable 'ansible_search_path' from source: unknown 12081 1726882382.96586: calling self._execute() 12081 1726882382.96637: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.96641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.96648: variable 'omit' from source: magic vars 12081 1726882382.96720: variable 'omit' from source: magic vars 12081 1726882382.96742: variable 'omit' from source: magic vars 12081 1726882382.96767: variable 'omit' from source: magic vars 12081 1726882382.96800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882382.96825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882382.96843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882382.96857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882382.96867: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882382.96889: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882382.96892: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.96894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.96966: Set connection var ansible_pipelining to False 12081 1726882382.96969: Set connection var ansible_shell_type to sh 12081 1726882382.96975: Set connection var ansible_shell_executable to /bin/sh 12081 1726882382.96977: Set connection var ansible_connection to ssh 12081 1726882382.96982: Set connection var ansible_timeout to 10 12081 1726882382.96987: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882382.97004: variable 'ansible_shell_executable' from source: unknown 12081 1726882382.97007: variable 'ansible_connection' from source: unknown 12081 1726882382.97009: variable 'ansible_module_compression' from source: unknown 12081 1726882382.97012: variable 'ansible_shell_type' from source: unknown 12081 1726882382.97014: variable 'ansible_shell_executable' from source: unknown 12081 1726882382.97016: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.97019: variable 'ansible_pipelining' from source: unknown 12081 1726882382.97022: variable 'ansible_timeout' from source: unknown 12081 1726882382.97024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.97128: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882382.97135: variable 'omit' from source: magic vars 12081 1726882382.97142: starting 
attempt loop 12081 1726882382.97145: running the handler 12081 1726882382.97155: handler run complete 12081 1726882382.97163: attempt loop complete, returning result 12081 1726882382.97167: _execute() done 12081 1726882382.97170: dumping result to json 12081 1726882382.97172: done dumping result, returning 12081 1726882382.97180: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0e448fcc-3ce9-0a3f-ff3c-000000000007] 12081 1726882382.97185: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000007 12081 1726882382.97259: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000007 12081 1726882382.97262: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 12081 1726882382.97319: no more pending results, returning what we have 12081 1726882382.97321: results queue empty 12081 1726882382.97322: checking for any_errors_fatal 12081 1726882382.97325: done checking for any_errors_fatal 12081 1726882382.97326: checking for max_fail_percentage 12081 1726882382.97327: done checking for max_fail_percentage 12081 1726882382.97328: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.97329: done checking to see if all hosts have failed 12081 1726882382.97330: getting the remaining hosts for this loop 12081 1726882382.97331: done getting the remaining hosts for this loop 12081 1726882382.97334: getting the next task for host managed_node3 12081 1726882382.97339: done getting next task for host managed_node3 12081 1726882382.97341: ^ task is: TASK: meta (flush_handlers) 12081 1726882382.97343: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.97346: getting variables 12081 1726882382.97348: in VariableManager get_vars() 12081 1726882382.97373: Calling all_inventory to load vars for managed_node3 12081 1726882382.97380: Calling groups_inventory to load vars for managed_node3 12081 1726882382.97382: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.97391: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.97393: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.97395: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.97502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.97614: done with get_vars() 12081 1726882382.97621: done getting variables 12081 1726882382.97669: in VariableManager get_vars() 12081 1726882382.97675: Calling all_inventory to load vars for managed_node3 12081 1726882382.97677: Calling groups_inventory to load vars for managed_node3 12081 1726882382.97678: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.97681: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.97683: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.97684: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.97908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.98012: done with get_vars() 12081 1726882382.98021: done queuing things up, now waiting for results queue to drain 12081 1726882382.98023: results queue empty 12081 1726882382.98023: checking for any_errors_fatal 12081 1726882382.98025: done checking for any_errors_fatal 12081 1726882382.98025: checking for max_fail_percentage 12081 1726882382.98026: done checking for max_fail_percentage 12081 1726882382.98027: checking to see if all hosts have failed and the running result is not 
ok 12081 1726882382.98027: done checking to see if all hosts have failed 12081 1726882382.98028: getting the remaining hosts for this loop 12081 1726882382.98029: done getting the remaining hosts for this loop 12081 1726882382.98030: getting the next task for host managed_node3 12081 1726882382.98033: done getting next task for host managed_node3 12081 1726882382.98034: ^ task is: TASK: meta (flush_handlers) 12081 1726882382.98035: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882382.98041: getting variables 12081 1726882382.98042: in VariableManager get_vars() 12081 1726882382.98047: Calling all_inventory to load vars for managed_node3 12081 1726882382.98048: Calling groups_inventory to load vars for managed_node3 12081 1726882382.98051: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.98054: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.98056: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.98057: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.98131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.98240: done with get_vars() 12081 1726882382.98245: done getting variables 12081 1726882382.98280: in VariableManager get_vars() 12081 1726882382.98285: Calling all_inventory to load vars for managed_node3 12081 1726882382.98287: Calling groups_inventory to load vars for managed_node3 12081 1726882382.98288: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.98291: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.98292: Calling groups_plugins_inventory to load vars for 
managed_node3 12081 1726882382.98294: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.98387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.98490: done with get_vars() 12081 1726882382.98498: done queuing things up, now waiting for results queue to drain 12081 1726882382.98499: results queue empty 12081 1726882382.98499: checking for any_errors_fatal 12081 1726882382.98500: done checking for any_errors_fatal 12081 1726882382.98500: checking for max_fail_percentage 12081 1726882382.98501: done checking for max_fail_percentage 12081 1726882382.98502: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.98502: done checking to see if all hosts have failed 12081 1726882382.98502: getting the remaining hosts for this loop 12081 1726882382.98503: done getting the remaining hosts for this loop 12081 1726882382.98505: getting the next task for host managed_node3 12081 1726882382.98506: done getting next task for host managed_node3 12081 1726882382.98507: ^ task is: None 12081 1726882382.98508: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.98508: done queuing things up, now waiting for results queue to drain 12081 1726882382.98509: results queue empty 12081 1726882382.98509: checking for any_errors_fatal 12081 1726882382.98510: done checking for any_errors_fatal 12081 1726882382.98510: checking for max_fail_percentage 12081 1726882382.98511: done checking for max_fail_percentage 12081 1726882382.98511: checking to see if all hosts have failed and the running result is not ok 12081 1726882382.98512: done checking to see if all hosts have failed 12081 1726882382.98513: getting the next task for host managed_node3 12081 1726882382.98514: done getting next task for host managed_node3 12081 1726882382.98515: ^ task is: None 12081 1726882382.98515: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.98553: in VariableManager get_vars() 12081 1726882382.98565: done with get_vars() 12081 1726882382.98570: in VariableManager get_vars() 12081 1726882382.98576: done with get_vars() 12081 1726882382.98579: variable 'omit' from source: magic vars 12081 1726882382.98600: in VariableManager get_vars() 12081 1726882382.98606: done with get_vars() 12081 1726882382.98618: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 12081 1726882382.98773: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12081 1726882382.98796: getting the remaining hosts for this loop 12081 1726882382.98798: done getting the remaining hosts for this loop 12081 1726882382.98799: getting the next task for host managed_node3 12081 1726882382.98801: done getting next task for host managed_node3 12081 1726882382.98802: ^ task is: TASK: Gathering Facts 12081 1726882382.98803: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882382.98804: getting variables 12081 1726882382.98805: in VariableManager get_vars() 12081 1726882382.98810: Calling all_inventory to load vars for managed_node3 12081 1726882382.98811: Calling groups_inventory to load vars for managed_node3 12081 1726882382.98813: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882382.98816: Calling all_plugins_play to load vars for managed_node3 12081 1726882382.98824: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882382.98826: Calling groups_plugins_play to load vars for managed_node3 12081 1726882382.98908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882382.99019: done with get_vars() 12081 1726882382.99023: done getting variables 12081 1726882382.99048: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Friday 20 September 2024 21:33:02 -0400 (0:00:00.028) 0:00:02.793 ****** 12081 1726882382.99067: entering _queue_task() for managed_node3/gather_facts 12081 1726882382.99246: worker is 1 (out of 1 available) 12081 1726882382.99258: exiting _queue_task() for managed_node3/gather_facts 12081 1726882382.99271: done queuing things up, now waiting for results queue to drain 12081 1726882382.99273: waiting for pending results... 
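The recurring "entering _queue_task() ... worker is 1 (out of 1 available) ... waiting for pending results..." sequence above is a producer/consumer handoff: the strategy queues a task to a worker process and then drains a results queue. A toy single-worker version of that flow, using threads and `queue.Queue` purely for illustration (Ansible actually uses separate worker processes, and none of the names below are its real API):

```python
# Toy sketch of the dispatch pattern traced in the log:
# _queue_task() -> worker runs TaskExecutor() -> result queued ->
# strategy drains the results queue. Illustration only.
import queue
import threading

task_queue: "queue.Queue" = queue.Queue()
results_queue: "queue.Queue" = queue.Queue()

def worker():
    # "worker is 1 (out of 1 available)"
    while True:
        task = task_queue.get()
        if task is None:  # sentinel, cf. "WORKER PROCESS EXITING"
            break
        # "running TaskExecutor() for <host>/TASK: <name>"
        results_queue.put({"task": task, "changed": False})

t = threading.Thread(target=worker)
t.start()

# "entering _queue_task() ... done queuing things up"
task_queue.put("Gathering Facts")
task_queue.put(None)

# "waiting for pending results..." then drain the queue
t.join()
result = results_queue.get()
```

The sentinel `None` plays the role of the worker-shutdown signal; in the real run the worker exits after sending each task result back to the strategy.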
12081 1726882382.99422: running TaskExecutor() for managed_node3/TASK: Gathering Facts 12081 1726882382.99479: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000071 12081 1726882382.99489: variable 'ansible_search_path' from source: unknown 12081 1726882382.99518: calling self._execute() 12081 1726882382.99574: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882382.99578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882382.99585: variable 'omit' from source: magic vars 12081 1726882382.99919: variable 'ansible_distribution_major_version' from source: facts 12081 1726882382.99929: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882382.99939: variable 'omit' from source: magic vars 12081 1726882382.99961: variable 'omit' from source: magic vars 12081 1726882383.00002: variable 'omit' from source: magic vars 12081 1726882383.00037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882383.00075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.00090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882383.00102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.00117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.00139: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.00142: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.00144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.00211: Set connection var ansible_pipelining to False 12081 1726882383.00216: Set 
connection var ansible_shell_type to sh 12081 1726882383.00225: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.00228: Set connection var ansible_connection to ssh 12081 1726882383.00232: Set connection var ansible_timeout to 10 12081 1726882383.00237: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.00257: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.00261: variable 'ansible_connection' from source: unknown 12081 1726882383.00264: variable 'ansible_module_compression' from source: unknown 12081 1726882383.00268: variable 'ansible_shell_type' from source: unknown 12081 1726882383.00270: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.00273: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.00275: variable 'ansible_pipelining' from source: unknown 12081 1726882383.00277: variable 'ansible_timeout' from source: unknown 12081 1726882383.00282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.00411: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.00418: variable 'omit' from source: magic vars 12081 1726882383.00423: starting attempt loop 12081 1726882383.00426: running the handler 12081 1726882383.00441: variable 'ansible_facts' from source: unknown 12081 1726882383.00459: _low_level_execute_command(): starting 12081 1726882383.00468: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882383.00997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.01021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.01034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.01085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.01110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.01214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.02884: stdout chunk (state=3): >>>/root <<< 12081 1726882383.02982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882383.03036: stderr chunk (state=3): >>><<< 12081 1726882383.03042: stdout chunk (state=3): >>><<< 12081 1726882383.03070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882383.03091: _low_level_execute_command(): starting 12081 1726882383.03099: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111 `" && echo ansible-tmp-1726882383.0308056-12237-117407559659111="` echo /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111 `" ) && sleep 0' 12081 1726882383.03555: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.03571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.03587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882383.03599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.03618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.03660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.03679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.03781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.05685: stdout chunk (state=3): >>>ansible-tmp-1726882383.0308056-12237-117407559659111=/root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111 <<< 12081 1726882383.05806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882383.05886: stderr chunk (state=3): >>><<< 12081 1726882383.05890: stdout chunk (state=3): >>><<< 12081 1726882383.06326: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882383.0308056-12237-117407559659111=/root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882383.06331: variable 'ansible_module_compression' from source: unknown 12081 1726882383.06333: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12081 1726882383.06335: variable 'ansible_facts' from source: unknown 12081 1726882383.06337: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/AnsiballZ_setup.py 12081 1726882383.06400: Sending initial data 12081 1726882383.06403: Sent initial data (154 bytes) 12081 1726882383.07318: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882383.07333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.07348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.07369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.07412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.07425: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882383.07439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.07458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882383.07481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
<<< 12081 1726882383.07492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882383.07505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.07518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.07534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.07546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.07557: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882383.07573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.07669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.07693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882383.07709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.07845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.09604: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882383.09700: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 
1726882383.09801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp0moij3hx /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/AnsiballZ_setup.py <<< 12081 1726882383.09896: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882383.12622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882383.12891: stderr chunk (state=3): >>><<< 12081 1726882383.12894: stdout chunk (state=3): >>><<< 12081 1726882383.12897: done transferring module to remote 12081 1726882383.12899: _low_level_execute_command(): starting 12081 1726882383.12902: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/ /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/AnsiballZ_setup.py && sleep 0' 12081 1726882383.13830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882383.13843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.13857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.13877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.13923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.13935: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882383.13947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.13965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882383.13978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882383.13991: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 12081 1726882383.14008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.14023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.14042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.14053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.14066: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882383.14079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.14157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.14181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882383.14196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.14330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.16088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882383.16161: stderr chunk (state=3): >>><<< 12081 1726882383.16176: stdout chunk (state=3): >>><<< 12081 1726882383.16270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882383.16274: _low_level_execute_command(): starting 12081 1726882383.16276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/AnsiballZ_setup.py && sleep 0' 12081 1726882383.17005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882383.17019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.17035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.17057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.17111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.17135: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882383.17152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.17176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882383.17191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882383.17203: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882383.17216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
12081 1726882383.17231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882383.17256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.17300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882383.17388: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882383.17410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.17494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.17525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882383.17545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.17732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.68544: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "03", "epoch": 
"1726882383", "epoch_int": "1726882383", "date": "2024-09-20", "time": "21:33:03", "iso8601_micro": "2024-09-21T01:33:03.418733Z", "iso8601": "2024-09-21T01:33:03Z", "iso8601_basic": "20240920T213303418733", "iso8601_basic_short": "20240920T213303", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48, "5m": 0.37, "15m": 0.17}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2828, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 704, "free": 2828}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 325, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264249950208, "block_size": 4096, "block_total": 65519355, "block_available": 64514148, "block_used": 1005207, "inode_total": 131071472, "inode_available": 130998785, "inode_used": 72687, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release<<< 12081 1726882383.68572: stdout chunk (state=3): >>>", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", 
"promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": 
"off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload":<<< 12081 1726882383.68582: stdout chunk (state=3): >>> "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12081 1726882383.70221: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882383.70278: stderr chunk (state=3): >>><<< 12081 1726882383.70281: stdout chunk (state=3): >>><<< 12081 1726882383.70316: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "03", "epoch": "1726882383", "epoch_int": "1726882383", "date": "2024-09-20", "time": "21:33:03", "iso8601_micro": "2024-09-21T01:33:03.418733Z", "iso8601": "2024-09-21T01:33:03Z", "iso8601_basic": "20240920T213303418733", "iso8601_basic_short": "20240920T213303", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48, "5m": 0.37, "15m": 0.17}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2828, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 704, "free": 2828}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": 
{"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 325, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264249950208, "block_size": 4096, "block_total": 65519355, "block_available": 64514148, "block_used": 1005207, "inode_total": 131071472, "inode_available": 130998785, "inode_used": 72687, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882383.70517: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882383.70533: _low_level_execute_command(): starting 12081 1726882383.70536: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882383.0308056-12237-117407559659111/ > /dev/null 2>&1 && sleep 0' 12081 1726882383.71009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882383.71022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882383.71046: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882383.71063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882383.71114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882383.71126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882383.71236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882383.73042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882383.73099: stderr chunk (state=3): >>><<< 12081 1726882383.73104: stdout chunk (state=3): >>><<< 12081 1726882383.73118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882383.73125: handler run complete 12081 1726882383.73202: variable 'ansible_facts' from source: unknown 12081 1726882383.73273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.73459: variable 'ansible_facts' from source: unknown 12081 1726882383.73525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.73602: attempt loop complete, returning result 12081 1726882383.73605: _execute() done 12081 1726882383.73608: dumping result to json 12081 1726882383.73626: done dumping result, returning 12081 1726882383.73634: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-0a3f-ff3c-000000000071] 12081 1726882383.73639: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000071 12081 1726882383.73941: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000071 12081 1726882383.73944: WORKER PROCESS EXITING ok: [managed_node3] 12081 1726882383.74123: no more pending results, returning what we have 12081 1726882383.74125: results queue empty 12081 1726882383.74126: checking for any_errors_fatal 12081 1726882383.74127: done checking for any_errors_fatal 12081 1726882383.74127: checking for max_fail_percentage 12081 1726882383.74128: done checking for max_fail_percentage 12081 1726882383.74129: checking to see if all hosts have failed and the running 
result is not ok 12081 1726882383.74129: done checking to see if all hosts have failed 12081 1726882383.74130: getting the remaining hosts for this loop 12081 1726882383.74131: done getting the remaining hosts for this loop 12081 1726882383.74133: getting the next task for host managed_node3 12081 1726882383.74137: done getting next task for host managed_node3 12081 1726882383.74138: ^ task is: TASK: meta (flush_handlers) 12081 1726882383.74139: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882383.74141: getting variables 12081 1726882383.74142: in VariableManager get_vars() 12081 1726882383.74158: Calling all_inventory to load vars for managed_node3 12081 1726882383.74160: Calling groups_inventory to load vars for managed_node3 12081 1726882383.74162: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.74173: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.74175: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.74178: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.74271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.74386: done with get_vars() 12081 1726882383.74393: done getting variables 12081 1726882383.74440: in VariableManager get_vars() 12081 1726882383.74446: Calling all_inventory to load vars for managed_node3 12081 1726882383.74447: Calling groups_inventory to load vars for managed_node3 12081 1726882383.74449: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.74452: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.74453: Calling groups_plugins_inventory to 
load vars for managed_node3 12081 1726882383.74455: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.74535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.74663: done with get_vars() 12081 1726882383.74674: done queuing things up, now waiting for results queue to drain 12081 1726882383.74675: results queue empty 12081 1726882383.74676: checking for any_errors_fatal 12081 1726882383.74678: done checking for any_errors_fatal 12081 1726882383.74678: checking for max_fail_percentage 12081 1726882383.74679: done checking for max_fail_percentage 12081 1726882383.74679: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.74685: done checking to see if all hosts have failed 12081 1726882383.74685: getting the remaining hosts for this loop 12081 1726882383.74686: done getting the remaining hosts for this loop 12081 1726882383.74688: getting the next task for host managed_node3 12081 1726882383.74690: done getting next task for host managed_node3 12081 1726882383.74691: ^ task is: TASK: Show playbook name 12081 1726882383.74692: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882383.74694: getting variables 12081 1726882383.74694: in VariableManager get_vars() 12081 1726882383.74699: Calling all_inventory to load vars for managed_node3 12081 1726882383.74700: Calling groups_inventory to load vars for managed_node3 12081 1726882383.74702: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.74705: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.74706: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.74707: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.74802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.74906: done with get_vars() 12081 1726882383.74911: done getting variables 12081 1726882383.74969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Friday 20 September 2024 21:33:03 -0400 (0:00:00.759) 0:00:03.552 ****** 12081 1726882383.74988: entering _queue_task() for managed_node3/debug 12081 1726882383.74989: Creating lock for debug 12081 1726882383.75185: worker is 1 (out of 1 available) 12081 1726882383.75198: exiting _queue_task() for managed_node3/debug 12081 1726882383.75210: done queuing things up, now waiting for results queue to drain 12081 1726882383.75212: waiting for pending results... 
12081 1726882383.75359: running TaskExecutor() for managed_node3/TASK: Show playbook name 12081 1726882383.75418: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000000b 12081 1726882383.75427: variable 'ansible_search_path' from source: unknown 12081 1726882383.75460: calling self._execute() 12081 1726882383.75522: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.75525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.75533: variable 'omit' from source: magic vars 12081 1726882383.75936: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.75960: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.75975: variable 'omit' from source: magic vars 12081 1726882383.76009: variable 'omit' from source: magic vars 12081 1726882383.76048: variable 'omit' from source: magic vars 12081 1726882383.76102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882383.76223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.76247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882383.76274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.76290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.76323: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.76330: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.76337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.76440: Set connection var ansible_pipelining to False 12081 1726882383.76449: Set 
connection var ansible_shell_type to sh 12081 1726882383.76470: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.76478: Set connection var ansible_connection to ssh 12081 1726882383.76503: Set connection var ansible_timeout to 10 12081 1726882383.76528: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.76553: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.76559: variable 'ansible_connection' from source: unknown 12081 1726882383.76562: variable 'ansible_module_compression' from source: unknown 12081 1726882383.76568: variable 'ansible_shell_type' from source: unknown 12081 1726882383.76570: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.76572: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.76575: variable 'ansible_pipelining' from source: unknown 12081 1726882383.76577: variable 'ansible_timeout' from source: unknown 12081 1726882383.76582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.76926: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.77025: variable 'omit' from source: magic vars 12081 1726882383.77036: starting attempt loop 12081 1726882383.77042: running the handler 12081 1726882383.77106: handler run complete 12081 1726882383.77142: attempt loop complete, returning result 12081 1726882383.77153: _execute() done 12081 1726882383.77160: dumping result to json 12081 1726882383.77170: done dumping result, returning 12081 1726882383.77183: done running TaskExecutor() for managed_node3/TASK: Show playbook name [0e448fcc-3ce9-0a3f-ff3c-00000000000b] 12081 1726882383.77197: sending task result for task 
0e448fcc-3ce9-0a3f-ff3c-00000000000b ok: [managed_node3] => {} MSG: this is: playbooks/tests_bond_options.yml 12081 1726882383.77348: no more pending results, returning what we have 12081 1726882383.77354: results queue empty 12081 1726882383.77355: checking for any_errors_fatal 12081 1726882383.77356: done checking for any_errors_fatal 12081 1726882383.77357: checking for max_fail_percentage 12081 1726882383.77359: done checking for max_fail_percentage 12081 1726882383.77360: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.77361: done checking to see if all hosts have failed 12081 1726882383.77361: getting the remaining hosts for this loop 12081 1726882383.77367: done getting the remaining hosts for this loop 12081 1726882383.77371: getting the next task for host managed_node3 12081 1726882383.77379: done getting next task for host managed_node3 12081 1726882383.77382: ^ task is: TASK: Include the task 'run_test.yml' 12081 1726882383.77385: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882383.77390: getting variables 12081 1726882383.77391: in VariableManager get_vars() 12081 1726882383.77422: Calling all_inventory to load vars for managed_node3 12081 1726882383.77425: Calling groups_inventory to load vars for managed_node3 12081 1726882383.77429: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.77441: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.77444: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.77447: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.77636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.77817: done with get_vars() 12081 1726882383.77828: done getting variables 12081 1726882383.77934: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000000b 12081 1726882383.77937: WORKER PROCESS EXITING TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Friday 20 September 2024 21:33:03 -0400 (0:00:00.030) 0:00:03.583 ****** 12081 1726882383.78008: entering _queue_task() for managed_node3/include_tasks 12081 1726882383.78252: worker is 1 (out of 1 available) 12081 1726882383.78264: exiting _queue_task() for managed_node3/include_tasks 12081 1726882383.78277: done queuing things up, now waiting for results queue to drain 12081 1726882383.78279: waiting for pending results... 
12081 1726882383.78725: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 12081 1726882383.80001: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000000d 12081 1726882383.80018: variable 'ansible_search_path' from source: unknown 12081 1726882383.80067: calling self._execute() 12081 1726882383.80144: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.80163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.80182: variable 'omit' from source: magic vars 12081 1726882383.80545: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.80568: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.80583: _execute() done 12081 1726882383.80592: dumping result to json 12081 1726882383.80601: done dumping result, returning 12081 1726882383.80612: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-0a3f-ff3c-00000000000d] 12081 1726882383.80624: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000000d 12081 1726882383.80771: no more pending results, returning what we have 12081 1726882383.80777: in VariableManager get_vars() 12081 1726882383.80812: Calling all_inventory to load vars for managed_node3 12081 1726882383.80816: Calling groups_inventory to load vars for managed_node3 12081 1726882383.80819: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.80833: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.80836: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.80839: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.81090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.81294: done with get_vars() 12081 1726882383.81301: variable 'ansible_search_path' from source: unknown 
12081 1726882383.81316: we have included files to process 12081 1726882383.81317: generating all_blocks data 12081 1726882383.81318: done generating all_blocks data 12081 1726882383.81319: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882383.81320: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882383.81322: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882383.81883: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000000d 12081 1726882383.81886: WORKER PROCESS EXITING 12081 1726882383.82142: in VariableManager get_vars() 12081 1726882383.82160: done with get_vars() 12081 1726882383.82201: in VariableManager get_vars() 12081 1726882383.82215: done with get_vars() 12081 1726882383.82256: in VariableManager get_vars() 12081 1726882383.82272: done with get_vars() 12081 1726882383.82311: in VariableManager get_vars() 12081 1726882383.82325: done with get_vars() 12081 1726882383.82369: in VariableManager get_vars() 12081 1726882383.82384: done with get_vars() 12081 1726882383.82739: in VariableManager get_vars() 12081 1726882383.82758: done with get_vars() 12081 1726882383.82773: done processing included file 12081 1726882383.82775: iterating over new_blocks loaded from include file 12081 1726882383.82776: in VariableManager get_vars() 12081 1726882383.82788: done with get_vars() 12081 1726882383.82790: filtering new block on tags 12081 1726882383.82917: done filtering new block on tags 12081 1726882383.82920: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 12081 1726882383.82925: extending task lists for all hosts with included 
blocks 12081 1726882383.82962: done extending task lists 12081 1726882383.82965: done processing included files 12081 1726882383.82966: results queue empty 12081 1726882383.82966: checking for any_errors_fatal 12081 1726882383.82970: done checking for any_errors_fatal 12081 1726882383.82971: checking for max_fail_percentage 12081 1726882383.82972: done checking for max_fail_percentage 12081 1726882383.82973: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.82974: done checking to see if all hosts have failed 12081 1726882383.82975: getting the remaining hosts for this loop 12081 1726882383.82976: done getting the remaining hosts for this loop 12081 1726882383.82979: getting the next task for host managed_node3 12081 1726882383.82983: done getting next task for host managed_node3 12081 1726882383.82985: ^ task is: TASK: TEST: {{ lsr_description }} 12081 1726882383.82987: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882383.82990: getting variables 12081 1726882383.82991: in VariableManager get_vars() 12081 1726882383.82999: Calling all_inventory to load vars for managed_node3 12081 1726882383.83001: Calling groups_inventory to load vars for managed_node3 12081 1726882383.83003: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.83009: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.83011: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.83013: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.83127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.83310: done with get_vars() 12081 1726882383.83318: done getting variables 12081 1726882383.83360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882383.83486: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:33:03 -0400 (0:00:00.055) 0:00:03.638 ****** 12081 1726882383.83528: entering _queue_task() for managed_node3/debug 12081 1726882383.83893: worker is 1 (out of 1 available) 12081 1726882383.83903: exiting _queue_task() for managed_node3/debug 12081 1726882383.83915: done queuing things up, now waiting for results queue to drain 12081 1726882383.83916: waiting for pending results... 
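The templated task banner above (`TASK [TEST: {{ lsr_description }}]`, resolved at `run_test.yml:5`) and the `##########`-delimited MSG in the result suggest a debug task of roughly this shape. This is a reconstruction inferred from the logged output, not the verbatim contents of `run_test.yml`:

```yaml
# Hypothetical reconstruction of the task at run_test.yml:5,
# inferred from the rendered banner and MSG in the log
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "##########\n{{ lsr_description }}\n##########"
```

The log confirms the key mechanics either way: `lsr_description` resolves "from source: include params", and the task is gated by the conditional `ansible_distribution_major_version != '6'`, which evaluates True here.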
12081 1726882383.84187: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 12081 1726882383.84310: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000088 12081 1726882383.84328: variable 'ansible_search_path' from source: unknown 12081 1726882383.84336: variable 'ansible_search_path' from source: unknown 12081 1726882383.84387: calling self._execute() 12081 1726882383.84578: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.84596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.84610: variable 'omit' from source: magic vars 12081 1726882383.84984: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.85002: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.85012: variable 'omit' from source: magic vars 12081 1726882383.85056: variable 'omit' from source: magic vars 12081 1726882383.85176: variable 'lsr_description' from source: include params 12081 1726882383.85198: variable 'omit' from source: magic vars 12081 1726882383.85245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882383.85287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.85314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882383.85338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.85365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.85401: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.85410: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.85416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.85516: Set connection var ansible_pipelining to False 12081 1726882383.85525: Set connection var ansible_shell_type to sh 12081 1726882383.85537: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.85544: Set connection var ansible_connection to ssh 12081 1726882383.85557: Set connection var ansible_timeout to 10 12081 1726882383.85572: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.85603: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.85612: variable 'ansible_connection' from source: unknown 12081 1726882383.85618: variable 'ansible_module_compression' from source: unknown 12081 1726882383.85623: variable 'ansible_shell_type' from source: unknown 12081 1726882383.85629: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.85634: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.85641: variable 'ansible_pipelining' from source: unknown 12081 1726882383.85645: variable 'ansible_timeout' from source: unknown 12081 1726882383.85654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.85804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.85820: variable 'omit' from source: magic vars 12081 1726882383.85829: starting attempt loop 12081 1726882383.85836: running the handler 12081 1726882383.85896: handler run complete 12081 1726882383.85915: attempt loop complete, 
returning result 12081 1726882383.85921: _execute() done 12081 1726882383.85927: dumping result to json 12081 1726882383.85934: done dumping result, returning 12081 1726882383.85945: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0e448fcc-3ce9-0a3f-ff3c-000000000088] 12081 1726882383.85960: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000088 ok: [managed_node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 12081 1726882383.86120: no more pending results, returning what we have 12081 1726882383.86124: results queue empty 12081 1726882383.86125: checking for any_errors_fatal 12081 1726882383.86126: done checking for any_errors_fatal 12081 1726882383.86127: checking for max_fail_percentage 12081 1726882383.86129: done checking for max_fail_percentage 12081 1726882383.86130: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.86131: done checking to see if all hosts have failed 12081 1726882383.86132: getting the remaining hosts for this loop 12081 1726882383.86134: done getting the remaining hosts for this loop 12081 1726882383.86138: getting the next task for host managed_node3 12081 1726882383.86146: done getting next task for host managed_node3 12081 1726882383.86152: ^ task is: TASK: Show item 12081 1726882383.86156: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882383.86159: getting variables 12081 1726882383.86161: in VariableManager get_vars() 12081 1726882383.86194: Calling all_inventory to load vars for managed_node3 12081 1726882383.86197: Calling groups_inventory to load vars for managed_node3 12081 1726882383.86201: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.86214: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.86217: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.86220: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.86428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.86655: done with get_vars() 12081 1726882383.86669: done getting variables 12081 1726882383.86722: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:33:03 -0400 (0:00:00.032) 0:00:03.670 ****** 12081 1726882383.86757: entering _queue_task() for managed_node3/debug 12081 1726882383.87018: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000088 12081 1726882383.87027: 
WORKER PROCESS EXITING 12081 1726882383.87272: worker is 1 (out of 1 available) 12081 1726882383.87283: exiting _queue_task() for managed_node3/debug 12081 1726882383.87295: done queuing things up, now waiting for results queue to drain 12081 1726882383.87297: waiting for pending results... 12081 1726882383.87552: running TaskExecutor() for managed_node3/TASK: Show item 12081 1726882383.87646: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000089 12081 1726882383.87670: variable 'ansible_search_path' from source: unknown 12081 1726882383.87677: variable 'ansible_search_path' from source: unknown 12081 1726882383.87729: variable 'omit' from source: magic vars 12081 1726882383.87861: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.87877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.87891: variable 'omit' from source: magic vars 12081 1726882383.88211: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.88229: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.88239: variable 'omit' from source: magic vars 12081 1726882383.88285: variable 'omit' from source: magic vars 12081 1726882383.88332: variable 'item' from source: unknown 12081 1726882383.88409: variable 'item' from source: unknown 12081 1726882383.88429: variable 'omit' from source: magic vars 12081 1726882383.88481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882383.88523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.88548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882383.88574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.88589: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.88624: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.88632: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.88638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.88748: Set connection var ansible_pipelining to False 12081 1726882383.88760: Set connection var ansible_shell_type to sh 12081 1726882383.88775: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.88781: Set connection var ansible_connection to ssh 12081 1726882383.88790: Set connection var ansible_timeout to 10 12081 1726882383.88798: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.88829: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.88837: variable 'ansible_connection' from source: unknown 12081 1726882383.88843: variable 'ansible_module_compression' from source: unknown 12081 1726882383.88848: variable 'ansible_shell_type' from source: unknown 12081 1726882383.88857: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.88865: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.88872: variable 'ansible_pipelining' from source: unknown 12081 1726882383.88878: variable 'ansible_timeout' from source: unknown 12081 1726882383.88885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.89028: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.89048: variable 'omit' from source: magic vars 12081 1726882383.89061: starting attempt 
loop 12081 1726882383.89071: running the handler 12081 1726882383.89120: variable 'lsr_description' from source: include params 12081 1726882383.89199: variable 'lsr_description' from source: include params 12081 1726882383.89213: handler run complete 12081 1726882383.89234: attempt loop complete, returning result 12081 1726882383.89262: variable 'item' from source: unknown 12081 1726882383.89327: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 12081 1726882383.89577: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.89593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.89608: variable 'omit' from source: magic vars 12081 1726882383.89778: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.89790: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.89799: variable 'omit' from source: magic vars 12081 1726882383.89819: variable 'omit' from source: magic vars 12081 1726882383.89874: variable 'item' from source: unknown 12081 1726882383.89945: variable 'item' from source: unknown 12081 1726882383.89971: variable 'omit' from source: magic vars 12081 1726882383.89997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.90012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.90025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12081 1726882383.90043: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.90059: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.90070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.90153: Set connection var ansible_pipelining to False 12081 1726882383.90167: Set connection var ansible_shell_type to sh 12081 1726882383.90182: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.90189: Set connection var ansible_connection to ssh 12081 1726882383.90199: Set connection var ansible_timeout to 10 12081 1726882383.90209: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.90236: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.90244: variable 'ansible_connection' from source: unknown 12081 1726882383.90255: variable 'ansible_module_compression' from source: unknown 12081 1726882383.90265: variable 'ansible_shell_type' from source: unknown 12081 1726882383.90276: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.90283: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.90292: variable 'ansible_pipelining' from source: unknown 12081 1726882383.90299: variable 'ansible_timeout' from source: unknown 12081 1726882383.90306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.90402: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.90416: variable 'omit' from source: magic vars 12081 1726882383.90425: starting attempt loop 12081 1726882383.90430: running the handler 12081 
1726882383.90458: variable 'lsr_setup' from source: include params 12081 1726882383.90531: variable 'lsr_setup' from source: include params 12081 1726882383.90582: handler run complete 12081 1726882383.90612: attempt loop complete, returning result 12081 1726882383.90631: variable 'item' from source: unknown 12081 1726882383.90698: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 12081 1726882383.90861: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.90875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.90886: variable 'omit' from source: magic vars 12081 1726882383.91043: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.91056: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.91068: variable 'omit' from source: magic vars 12081 1726882383.91087: variable 'omit' from source: magic vars 12081 1726882383.91127: variable 'item' from source: unknown 12081 1726882383.91197: variable 'item' from source: unknown 12081 1726882383.91216: variable 'omit' from source: magic vars 12081 1726882383.91237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.91256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.91268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.91282: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.91289: variable 'ansible_host' from source: host vars for 
'managed_node3' 12081 1726882383.91296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.91375: Set connection var ansible_pipelining to False 12081 1726882383.91382: Set connection var ansible_shell_type to sh 12081 1726882383.91393: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.91399: Set connection var ansible_connection to ssh 12081 1726882383.91407: Set connection var ansible_timeout to 10 12081 1726882383.91415: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.91436: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.91443: variable 'ansible_connection' from source: unknown 12081 1726882383.91449: variable 'ansible_module_compression' from source: unknown 12081 1726882383.91457: variable 'ansible_shell_type' from source: unknown 12081 1726882383.91468: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.91476: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.91483: variable 'ansible_pipelining' from source: unknown 12081 1726882383.91489: variable 'ansible_timeout' from source: unknown 12081 1726882383.91495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.91585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.91596: variable 'omit' from source: magic vars 12081 1726882383.91603: starting attempt loop 12081 1726882383.91609: running the handler 12081 1726882383.91629: variable 'lsr_test' from source: include params 12081 1726882383.91702: variable 'lsr_test' from source: include params 12081 1726882383.91722: handler run complete 12081 1726882383.91739: attempt loop 
complete, returning result 12081 1726882383.91758: variable 'item' from source: unknown 12081 1726882383.91826: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 12081 1726882383.91974: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.91986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.91997: variable 'omit' from source: magic vars 12081 1726882383.92158: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.92170: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.92177: variable 'omit' from source: magic vars 12081 1726882383.92193: variable 'omit' from source: magic vars 12081 1726882383.92231: variable 'item' from source: unknown 12081 1726882383.92303: variable 'item' from source: unknown 12081 1726882383.92321: variable 'omit' from source: magic vars 12081 1726882383.92340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.92358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.92369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.92383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.92390: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.92396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.92474: Set connection var ansible_pipelining to False 12081 1726882383.92481: Set connection var ansible_shell_type to sh 12081 1726882383.92493: Set 
connection var ansible_shell_executable to /bin/sh 12081 1726882383.92498: Set connection var ansible_connection to ssh 12081 1726882383.92507: Set connection var ansible_timeout to 10 12081 1726882383.92514: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.92536: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.92542: variable 'ansible_connection' from source: unknown 12081 1726882383.92548: variable 'ansible_module_compression' from source: unknown 12081 1726882383.92559: variable 'ansible_shell_type' from source: unknown 12081 1726882383.92571: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.92579: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.92587: variable 'ansible_pipelining' from source: unknown 12081 1726882383.92593: variable 'ansible_timeout' from source: unknown 12081 1726882383.92600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.92689: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.92701: variable 'omit' from source: magic vars 12081 1726882383.92709: starting attempt loop 12081 1726882383.92714: running the handler 12081 1726882383.92734: variable 'lsr_assert' from source: include params 12081 1726882383.92805: variable 'lsr_assert' from source: include params 12081 1726882383.92827: handler run complete 12081 1726882383.92846: attempt loop complete, returning result 12081 1726882383.92869: variable 'item' from source: unknown 12081 1726882383.92933: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ 
"tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", "tasks/assert_bond_options.yml" ] } 12081 1726882383.93076: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.93090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.93108: variable 'omit' from source: magic vars 12081 1726882383.93275: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.93284: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.93291: variable 'omit' from source: magic vars 12081 1726882383.93305: variable 'omit' from source: magic vars 12081 1726882383.93345: variable 'item' from source: unknown 12081 1726882383.93419: variable 'item' from source: unknown 12081 1726882383.93438: variable 'omit' from source: magic vars 12081 1726882383.93469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.93482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.93493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.93506: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.93513: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.93520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.93605: Set connection var ansible_pipelining to False 12081 1726882383.93613: Set connection var ansible_shell_type to sh 12081 1726882383.93624: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.93631: Set connection var ansible_connection to ssh 12081 1726882383.93639: Set connection var 
ansible_timeout to 10 12081 1726882383.93647: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.93678: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.93710: variable 'ansible_connection' from source: unknown 12081 1726882383.93718: variable 'ansible_module_compression' from source: unknown 12081 1726882383.93724: variable 'ansible_shell_type' from source: unknown 12081 1726882383.93730: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.93737: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.93745: variable 'ansible_pipelining' from source: unknown 12081 1726882383.93753: variable 'ansible_timeout' from source: unknown 12081 1726882383.93761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.93892: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.93909: variable 'omit' from source: magic vars 12081 1726882383.93917: starting attempt loop 12081 1726882383.93923: running the handler 12081 1726882383.94054: handler run complete 12081 1726882383.94074: attempt loop complete, returning result 12081 1726882383.94094: variable 'item' from source: unknown 12081 1726882383.94163: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 12081 1726882383.94308: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.94321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.94334: variable 'omit' from source: magic vars 
12081 1726882383.94521: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.94532: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.94540: variable 'omit' from source: magic vars 12081 1726882383.94576: variable 'omit' from source: magic vars 12081 1726882383.94617: variable 'item' from source: unknown 12081 1726882383.94706: variable 'item' from source: unknown 12081 1726882383.94733: variable 'omit' from source: magic vars 12081 1726882383.94774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.94796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.94819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.94834: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.94842: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.94849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.94953: Set connection var ansible_pipelining to False 12081 1726882383.94971: Set connection var ansible_shell_type to sh 12081 1726882383.95015: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.95022: Set connection var ansible_connection to ssh 12081 1726882383.95032: Set connection var ansible_timeout to 10 12081 1726882383.95041: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.95071: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.95079: variable 'ansible_connection' from source: unknown 12081 1726882383.95086: variable 'ansible_module_compression' from source: unknown 12081 1726882383.95093: variable 
'ansible_shell_type' from source: unknown 12081 1726882383.95099: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.95106: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.95118: variable 'ansible_pipelining' from source: unknown 12081 1726882383.95126: variable 'ansible_timeout' from source: unknown 12081 1726882383.95134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.95275: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.95287: variable 'omit' from source: magic vars 12081 1726882383.95296: starting attempt loop 12081 1726882383.95302: running the handler 12081 1726882383.95323: variable 'lsr_fail_debug' from source: play vars 12081 1726882383.95396: variable 'lsr_fail_debug' from source: play vars 12081 1726882383.95418: handler run complete 12081 1726882383.95435: attempt loop complete, returning result 12081 1726882383.95461: variable 'item' from source: unknown 12081 1726882383.95527: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 12081 1726882383.95675: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.95687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.95699: variable 'omit' from source: magic vars 12081 1726882383.95846: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.95859: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.95869: variable 'omit' from source: magic vars 12081 
1726882383.95890: variable 'omit' from source: magic vars 12081 1726882383.95931: variable 'item' from source: unknown 12081 1726882383.96009: variable 'item' from source: unknown 12081 1726882383.96039: variable 'omit' from source: magic vars 12081 1726882383.96073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882383.96099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.96139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882383.96176: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882383.96195: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.96208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.96299: Set connection var ansible_pipelining to False 12081 1726882383.96315: Set connection var ansible_shell_type to sh 12081 1726882383.96338: Set connection var ansible_shell_executable to /bin/sh 12081 1726882383.96341: Set connection var ansible_connection to ssh 12081 1726882383.96344: Set connection var ansible_timeout to 10 12081 1726882383.96349: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882383.96369: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.96372: variable 'ansible_connection' from source: unknown 12081 1726882383.96374: variable 'ansible_module_compression' from source: unknown 12081 1726882383.96377: variable 'ansible_shell_type' from source: unknown 12081 1726882383.96379: variable 'ansible_shell_executable' from source: unknown 12081 1726882383.96381: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.96383: variable 
'ansible_pipelining' from source: unknown 12081 1726882383.96385: variable 'ansible_timeout' from source: unknown 12081 1726882383.96390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.96478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882383.96484: variable 'omit' from source: magic vars 12081 1726882383.96487: starting attempt loop 12081 1726882383.96490: running the handler 12081 1726882383.96503: variable 'lsr_cleanup' from source: include params 12081 1726882383.96549: variable 'lsr_cleanup' from source: include params 12081 1726882383.96564: handler run complete 12081 1726882383.96576: attempt loop complete, returning result 12081 1726882383.96586: variable 'item' from source: unknown 12081 1726882383.96634: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 12081 1726882383.96713: dumping result to json 12081 1726882383.96716: done dumping result, returning 12081 1726882383.96718: done running TaskExecutor() for managed_node3/TASK: Show item [0e448fcc-3ce9-0a3f-ff3c-000000000089] 12081 1726882383.96720: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000089 12081 1726882383.96761: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000089 12081 1726882383.96766: WORKER PROCESS EXITING 12081 1726882383.96809: no more pending results, returning what we have 12081 1726882383.96812: results queue empty 12081 1726882383.96812: checking for any_errors_fatal 12081 1726882383.96820: done checking for any_errors_fatal 12081 1726882383.96821: 
checking for max_fail_percentage 12081 1726882383.96822: done checking for max_fail_percentage 12081 1726882383.96823: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.96824: done checking to see if all hosts have failed 12081 1726882383.96824: getting the remaining hosts for this loop 12081 1726882383.96826: done getting the remaining hosts for this loop 12081 1726882383.96829: getting the next task for host managed_node3 12081 1726882383.96835: done getting next task for host managed_node3 12081 1726882383.96838: ^ task is: TASK: Include the task 'show_interfaces.yml' 12081 1726882383.96840: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882383.96844: getting variables 12081 1726882383.96845: in VariableManager get_vars() 12081 1726882383.96881: Calling all_inventory to load vars for managed_node3 12081 1726882383.96884: Calling groups_inventory to load vars for managed_node3 12081 1726882383.96888: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.96898: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.96900: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.96903: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.97026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.97163: done with get_vars() 12081 1726882383.97172: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:33:03 -0400 (0:00:00.104) 0:00:03.775 ****** 12081 1726882383.97234: entering _queue_task() for managed_node3/include_tasks 12081 1726882383.97424: worker is 1 (out of 1 available) 12081 1726882383.97436: exiting _queue_task() for managed_node3/include_tasks 12081 1726882383.97448: done queuing things up, now waiting for results queue to drain 12081 1726882383.97452: waiting for pending results... 
12081 1726882383.97596: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 12081 1726882383.97658: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008a 12081 1726882383.97667: variable 'ansible_search_path' from source: unknown 12081 1726882383.97670: variable 'ansible_search_path' from source: unknown 12081 1726882383.97700: calling self._execute() 12081 1726882383.97757: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882383.97760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882383.97770: variable 'omit' from source: magic vars 12081 1726882383.98016: variable 'ansible_distribution_major_version' from source: facts 12081 1726882383.98027: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882383.98033: _execute() done 12081 1726882383.98036: dumping result to json 12081 1726882383.98039: done dumping result, returning 12081 1726882383.98044: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-0a3f-ff3c-00000000008a] 12081 1726882383.98053: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008a 12081 1726882383.98142: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008a 12081 1726882383.98144: WORKER PROCESS EXITING 12081 1726882383.98201: no more pending results, returning what we have 12081 1726882383.98207: in VariableManager get_vars() 12081 1726882383.98239: Calling all_inventory to load vars for managed_node3 12081 1726882383.98242: Calling groups_inventory to load vars for managed_node3 12081 1726882383.98246: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.98269: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.98272: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.98275: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.98501: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.98709: done with get_vars() 12081 1726882383.98715: variable 'ansible_search_path' from source: unknown 12081 1726882383.98716: variable 'ansible_search_path' from source: unknown 12081 1726882383.98766: we have included files to process 12081 1726882383.98767: generating all_blocks data 12081 1726882383.98769: done generating all_blocks data 12081 1726882383.98775: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882383.98776: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882383.98778: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882383.98927: in VariableManager get_vars() 12081 1726882383.98953: done with get_vars() 12081 1726882383.99069: done processing included file 12081 1726882383.99078: iterating over new_blocks loaded from include file 12081 1726882383.99080: in VariableManager get_vars() 12081 1726882383.99104: done with get_vars() 12081 1726882383.99112: filtering new block on tags 12081 1726882383.99175: done filtering new block on tags 12081 1726882383.99178: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 12081 1726882383.99199: extending task lists for all hosts with included blocks 12081 1726882383.99642: done extending task lists 12081 1726882383.99643: done processing included files 12081 1726882383.99643: results queue empty 12081 1726882383.99644: checking for any_errors_fatal 12081 1726882383.99648: done checking for any_errors_fatal 12081 1726882383.99648: checking for 
max_fail_percentage 12081 1726882383.99649: done checking for max_fail_percentage 12081 1726882383.99649: checking to see if all hosts have failed and the running result is not ok 12081 1726882383.99650: done checking to see if all hosts have failed 12081 1726882383.99651: getting the remaining hosts for this loop 12081 1726882383.99652: done getting the remaining hosts for this loop 12081 1726882383.99654: getting the next task for host managed_node3 12081 1726882383.99656: done getting next task for host managed_node3 12081 1726882383.99658: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 12081 1726882383.99660: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882383.99661: getting variables 12081 1726882383.99662: in VariableManager get_vars() 12081 1726882383.99669: Calling all_inventory to load vars for managed_node3 12081 1726882383.99670: Calling groups_inventory to load vars for managed_node3 12081 1726882383.99672: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882383.99675: Calling all_plugins_play to load vars for managed_node3 12081 1726882383.99677: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882383.99678: Calling groups_plugins_play to load vars for managed_node3 12081 1726882383.99758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882383.99867: done with get_vars() 12081 1726882383.99873: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:33:03 -0400 (0:00:00.026) 0:00:03.802 ****** 12081 1726882383.99918: entering _queue_task() for managed_node3/include_tasks 12081 1726882384.00093: worker is 1 (out of 1 available) 12081 1726882384.00103: exiting _queue_task() for managed_node3/include_tasks 12081 1726882384.00116: done queuing things up, now waiting for results queue to drain 12081 1726882384.00117: waiting for pending results... 
12081 1726882384.00276: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 12081 1726882384.00335: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000000b1 12081 1726882384.00346: variable 'ansible_search_path' from source: unknown 12081 1726882384.00349: variable 'ansible_search_path' from source: unknown 12081 1726882384.00384: calling self._execute() 12081 1726882384.00444: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.00449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.00458: variable 'omit' from source: magic vars 12081 1726882384.00707: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.00717: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.00723: _execute() done 12081 1726882384.00726: dumping result to json 12081 1726882384.00729: done dumping result, returning 12081 1726882384.00735: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000000b1] 12081 1726882384.00741: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000b1 12081 1726882384.00821: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000b1 12081 1726882384.00824: WORKER PROCESS EXITING 12081 1726882384.00854: no more pending results, returning what we have 12081 1726882384.00860: in VariableManager get_vars() 12081 1726882384.00888: Calling all_inventory to load vars for managed_node3 12081 1726882384.00890: Calling groups_inventory to load vars for managed_node3 12081 1726882384.00893: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882384.00902: Calling all_plugins_play to load vars for managed_node3 12081 1726882384.00905: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882384.00907: Calling groups_plugins_play to load vars for managed_node3 12081 
1726882384.01017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882384.01144: done with get_vars() 12081 1726882384.01149: variable 'ansible_search_path' from source: unknown 12081 1726882384.01149: variable 'ansible_search_path' from source: unknown 12081 1726882384.01177: we have included files to process 12081 1726882384.01178: generating all_blocks data 12081 1726882384.01179: done generating all_blocks data 12081 1726882384.01180: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882384.01181: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882384.01182: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882384.01396: done processing included file 12081 1726882384.01397: iterating over new_blocks loaded from include file 12081 1726882384.01398: in VariableManager get_vars() 12081 1726882384.01406: done with get_vars() 12081 1726882384.01408: filtering new block on tags 12081 1726882384.01428: done filtering new block on tags 12081 1726882384.01429: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 12081 1726882384.01432: extending task lists for all hosts with included blocks 12081 1726882384.01641: done extending task lists 12081 1726882384.01643: done processing included files 12081 1726882384.01643: results queue empty 12081 1726882384.01644: checking for any_errors_fatal 12081 1726882384.01647: done checking for any_errors_fatal 12081 1726882384.01648: checking for max_fail_percentage 12081 1726882384.01649: done 
checking for max_fail_percentage 12081 1726882384.01652: checking to see if all hosts have failed and the running result is not ok 12081 1726882384.01653: done checking to see if all hosts have failed 12081 1726882384.01654: getting the remaining hosts for this loop 12081 1726882384.01655: done getting the remaining hosts for this loop 12081 1726882384.01666: getting the next task for host managed_node3 12081 1726882384.01672: done getting next task for host managed_node3 12081 1726882384.01674: ^ task is: TASK: Gather current interface info 12081 1726882384.01677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882384.01680: getting variables 12081 1726882384.01681: in VariableManager get_vars() 12081 1726882384.01688: Calling all_inventory to load vars for managed_node3 12081 1726882384.01690: Calling groups_inventory to load vars for managed_node3 12081 1726882384.01693: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882384.01708: Calling all_plugins_play to load vars for managed_node3 12081 1726882384.01711: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882384.01715: Calling groups_plugins_play to load vars for managed_node3 12081 1726882384.01867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882384.02082: done with get_vars() 12081 1726882384.02091: done getting variables 12081 1726882384.02126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:33:04 -0400 (0:00:00.022) 0:00:03.824 ****** 12081 1726882384.02166: entering _queue_task() for managed_node3/command 12081 1726882384.02398: worker is 1 (out of 1 available) 12081 1726882384.02409: exiting _queue_task() for managed_node3/command 12081 1726882384.02422: done queuing things up, now waiting for results queue to drain 12081 1726882384.02423: waiting for pending results... 
12081 1726882384.02680: running TaskExecutor() for managed_node3/TASK: Gather current interface info 12081 1726882384.02802: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000000ec 12081 1726882384.02824: variable 'ansible_search_path' from source: unknown 12081 1726882384.02845: variable 'ansible_search_path' from source: unknown 12081 1726882384.02925: calling self._execute() 12081 1726882384.03355: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.03358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.03366: variable 'omit' from source: magic vars 12081 1726882384.03609: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.03619: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.03625: variable 'omit' from source: magic vars 12081 1726882384.03659: variable 'omit' from source: magic vars 12081 1726882384.03686: variable 'omit' from source: magic vars 12081 1726882384.03718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882384.03742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882384.03759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882384.03774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.03783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.03808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882384.03811: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.03814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882384.03885: Set connection var ansible_pipelining to False 12081 1726882384.03889: Set connection var ansible_shell_type to sh 12081 1726882384.03893: Set connection var ansible_shell_executable to /bin/sh 12081 1726882384.03901: Set connection var ansible_connection to ssh 12081 1726882384.03906: Set connection var ansible_timeout to 10 12081 1726882384.03911: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882384.03928: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.03930: variable 'ansible_connection' from source: unknown 12081 1726882384.03933: variable 'ansible_module_compression' from source: unknown 12081 1726882384.03936: variable 'ansible_shell_type' from source: unknown 12081 1726882384.03938: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.03940: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.03944: variable 'ansible_pipelining' from source: unknown 12081 1726882384.03946: variable 'ansible_timeout' from source: unknown 12081 1726882384.03950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.04052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882384.04061: variable 'omit' from source: magic vars 12081 1726882384.04067: starting attempt loop 12081 1726882384.04070: running the handler 12081 1726882384.04084: _low_level_execute_command(): starting 12081 1726882384.04091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882384.04587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882384.04603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.04614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882384.04625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.04641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.04686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.04698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.04811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.06480: stdout chunk (state=3): >>>/root <<< 12081 1726882384.06578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.06666: stderr chunk (state=3): >>><<< 12081 1726882384.06679: stdout chunk (state=3): >>><<< 12081 1726882384.06793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882384.06797: _low_level_execute_command(): starting 12081 1726882384.06800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331 `" && echo ansible-tmp-1726882384.0670743-12283-235869604176331="` echo /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331 `" ) && sleep 0' 12081 1726882384.07456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.07460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.07463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.07500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.07511: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.07513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.07552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.07572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882384.07578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.07681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.09573: stdout chunk (state=3): >>>ansible-tmp-1726882384.0670743-12283-235869604176331=/root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331 <<< 12081 1726882384.09683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.09750: stderr chunk (state=3): >>><<< 12081 1726882384.09753: stdout chunk (state=3): >>><<< 12081 1726882384.10107: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.0670743-12283-235869604176331=/root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882384.10110: variable 'ansible_module_compression' from source: unknown 12081 1726882384.10112: ANSIBALLZ: Using generic lock for ansible.legacy.command 12081 1726882384.10114: ANSIBALLZ: Acquiring lock 12081 1726882384.10116: ANSIBALLZ: Lock acquired: 139893497835168 12081 1726882384.10118: ANSIBALLZ: Creating module 12081 1726882384.19903: ANSIBALLZ: Writing module into payload 12081 1726882384.19977: ANSIBALLZ: Writing module 12081 1726882384.19995: ANSIBALLZ: Renaming module 12081 1726882384.19999: ANSIBALLZ: Done creating module 12081 1726882384.20013: variable 'ansible_facts' from source: unknown 12081 1726882384.20064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/AnsiballZ_command.py 12081 1726882384.20174: Sending initial data 12081 1726882384.20178: Sent initial data (156 bytes) 12081 1726882384.21019: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.21029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.21041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.21055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882384.21095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.21103: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.21110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.21123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882384.21131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882384.21137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882384.21144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.21154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.21166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.21174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.21180: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882384.21189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.21259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.21279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882384.21290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.21414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.23204: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882384.23309: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882384.23403: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpcizb5r9k /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/AnsiballZ_command.py <<< 12081 1726882384.23500: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882384.24828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.25082: stderr chunk (state=3): >>><<< 12081 1726882384.25086: stdout chunk (state=3): >>><<< 12081 1726882384.25088: done transferring module to remote 12081 1726882384.25090: _low_level_execute_command(): starting 12081 1726882384.25092: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/ /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/AnsiballZ_command.py && sleep 0' 12081 1726882384.25683: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.25697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.25709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.25754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.25802: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.25822: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.25838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.25860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.25865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.25878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.25881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882384.25888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.25952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.25977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.26082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.27882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.27988: stderr chunk (state=3): >>><<< 12081 1726882384.27992: stdout chunk (state=3): >>><<< 12081 1726882384.27995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882384.27997: _low_level_execute_command(): starting 12081 1726882384.27999: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/AnsiballZ_command.py && sleep 0' 12081 1726882384.28613: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.28628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.28646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.28677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.28719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.28736: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.28761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.28803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 
1726882384.28822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882384.28832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882384.28842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.28855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.28873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.28897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.28911: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882384.28926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.28987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.28998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.29118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.42738: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:33:04.422570", "end": "2024-09-20 21:33:04.425911", "delta": "0:00:00.003341", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882384.44059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882384.44065: stdout chunk (state=3): >>><<< 12081 1726882384.44068: stderr chunk (state=3): >>><<< 12081 1726882384.44204: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:33:04.422570", "end": "2024-09-20 21:33:04.425911", "delta": "0:00:00.003341", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
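The `_low_level_execute_command()` sequence traced above follows a fixed lifecycle: create a private remote temp directory (`umask 77 && mkdir -p ...`), transfer the `AnsiballZ_command.py` payload over sftp, `chmod u+x` the directory and payload, run the payload with the remote interpreter, and finally `rm -f -r` the directory. The sketch below re-creates that lifecycle locally in plain Python. It is illustrative only, not Ansible's actual implementation; the stub payload stands in for the real zipped AnsiballZ module.

```python
# Illustrative re-creation of the remote-execution lifecycle visible in this
# log: temp dir -> payload -> chmod -> execute -> cleanup. Not Ansible's real
# code; the payload below is a stand-in for the real AnsiballZ_command.py.
import json
import os
import shutil
import subprocess
import sys
import tempfile

tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")        # stands in for `umask 77 && mkdir -p ...`
payload = os.path.join(tmpdir, "AnsiballZ_command.py")  # file name taken from the log
with open(payload, "w") as f:
    # A real payload is a self-extracting zip of the module; this stub just
    # emits a result dict shaped like the one in the log's stdout chunk.
    f.write('import json; print(json.dumps({"stdout": "bonding_masters\\neth0\\nlo", "rc": 0}))\n')
os.chmod(payload, 0o700)                                # mirrors the `chmod u+x` step
proc = subprocess.run([sys.executable, payload], capture_output=True, text=True)
shutil.rmtree(tmpdir)                                   # mirrors the final `rm -f -r` cleanup
result = json.loads(proc.stdout)
print(result["rc"], result["stdout"].splitlines())
```

The echoed `ansible-tmp-...` directory name in the log serves the same purpose as the `tmpdir` return value here: the controller learns the remote path it just created so later commands can reference it.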
12081 1726882384.44208: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882384.44216: _low_level_execute_command(): starting 12081 1726882384.44218: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.0670743-12283-235869604176331/ > /dev/null 2>&1 && sleep 0' 12081 1726882384.45762: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.45782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.45798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.45816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.45867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.45969: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.45985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.46003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882384.46014: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882384.46025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882384.46036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.46050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.46069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.46084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.46095: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882384.46108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.46298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.46322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882384.46339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.46476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.48413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.48417: stdout chunk (state=3): >>><<< 12081 1726882384.48420: stderr chunk (state=3): >>><<< 12081 1726882384.48674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12081 1726882384.48677: handler run complete
12081 1726882384.48680: Evaluated conditional (False): False
12081 1726882384.48682: attempt loop complete, returning result
12081 1726882384.48684: _execute() done
12081 1726882384.48686: dumping result to json
12081 1726882384.48687: done dumping result, returning
12081 1726882384.48689: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-0a3f-ff3c-0000000000ec]
12081 1726882384.48691: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000ec
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003341",
    "end": "2024-09-20 21:33:04.425911",
    "rc": 0,
    "start": "2024-09-20 21:33:04.422570"
}

STDOUT:

bonding_masters
eth0
lo

12081 1726882384.49212: no more pending results, returning what we have
12081 1726882384.49215: results queue empty
12081 1726882384.49216: checking for any_errors_fatal
12081 1726882384.49217: done checking for any_errors_fatal
12081 1726882384.49218: checking for max_fail_percentage
12081 1726882384.49220: done checking for max_fail_percentage
12081 1726882384.49221: checking to see if all hosts have failed and the running result is not ok
12081 1726882384.49222: done checking to see if all hosts have failed
12081 1726882384.49222: getting the
remaining hosts for this loop 12081 1726882384.49224: done getting the remaining hosts for this loop 12081 1726882384.49228: getting the next task for host managed_node3 12081 1726882384.49234: done getting next task for host managed_node3 12081 1726882384.49236: ^ task is: TASK: Set current_interfaces 12081 1726882384.49242: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12081 1726882384.49244: getting variables
12081 1726882384.49246: in VariableManager get_vars()
12081 1726882384.49271: Calling all_inventory to load vars for managed_node3
12081 1726882384.49274: Calling groups_inventory to load vars for managed_node3
12081 1726882384.49277: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882384.49290: Calling all_plugins_play to load vars for managed_node3
12081 1726882384.49292: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882384.49295: Calling groups_plugins_play to load vars for managed_node3
12081 1726882384.49427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882384.49613: done with get_vars()
12081 1726882384.49622: done getting variables
12081 1726882384.49677: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024 21:33:04 -0400 (0:00:00.475) 0:00:04.300 ******
12081 1726882384.49716: entering _queue_task() for managed_node3/set_fact
12081 1726882384.51434: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000ec
12081 1726882384.51470: WORKER PROCESS EXITING
12081 1726882384.51459: worker is 1 (out of 1 available)
12081 1726882384.51477: exiting _queue_task() for managed_node3/set_fact
12081 1726882384.51486: done queuing things up, now waiting for results queue to drain
12081 1726882384.51488: waiting for pending results...
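The "Gather current interface info" task above ran `ls -1` in `/sys/class/net` and its result was registered (the log later resolves a `_current_interfaces` variable from `set_fact`). A minimal sketch of how the following "Set current_interfaces" task can derive its fact from that result; the stdout value is copied from the log, while splitting on newlines is an assumption about the tasks file:

```python
# Sketch of deriving the current_interfaces fact from the registered command
# result shown above. The stdout string is taken verbatim from the log's
# module result; treating the fact as stdout split into lines is an assumption.
command_result = {"stdout": "bonding_masters\neth0\nlo", "rc": 0}

# Ansible exposes this split as `stdout_lines`; a set_fact task can store it.
current_interfaces = command_result["stdout"].splitlines()
print(current_interfaces)
```

The resulting list matches the `ansible_facts.current_interfaces` value that the log reports for the `set_fact` task.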
12081 1726882384.52283: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 12081 1726882384.52502: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000000ed 12081 1726882384.52517: variable 'ansible_search_path' from source: unknown 12081 1726882384.52522: variable 'ansible_search_path' from source: unknown 12081 1726882384.52562: calling self._execute() 12081 1726882384.52858: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.52865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.52876: variable 'omit' from source: magic vars 12081 1726882384.53679: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.53692: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.53699: variable 'omit' from source: magic vars 12081 1726882384.53753: variable 'omit' from source: magic vars 12081 1726882384.53985: variable '_current_interfaces' from source: set_fact 12081 1726882384.54160: variable 'omit' from source: magic vars 12081 1726882384.54202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882384.54302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882384.54326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882384.54342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.54357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.54388: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882384.54392: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.54395: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.54611: Set connection var ansible_pipelining to False 12081 1726882384.54614: Set connection var ansible_shell_type to sh 12081 1726882384.54622: Set connection var ansible_shell_executable to /bin/sh 12081 1726882384.54624: Set connection var ansible_connection to ssh 12081 1726882384.54630: Set connection var ansible_timeout to 10 12081 1726882384.54636: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882384.54775: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.54779: variable 'ansible_connection' from source: unknown 12081 1726882384.54781: variable 'ansible_module_compression' from source: unknown 12081 1726882384.54784: variable 'ansible_shell_type' from source: unknown 12081 1726882384.54786: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.54788: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.54790: variable 'ansible_pipelining' from source: unknown 12081 1726882384.54793: variable 'ansible_timeout' from source: unknown 12081 1726882384.54797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.55043: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882384.55194: variable 'omit' from source: magic vars 12081 1726882384.55206: starting attempt loop 12081 1726882384.55213: running the handler 12081 1726882384.55228: handler run complete 12081 1726882384.55242: attempt loop complete, returning result 12081 1726882384.55249: _execute() done 12081 1726882384.55258: dumping result to json 12081 1726882384.55291: done dumping result, returning 12081 
1726882384.55306: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-0a3f-ff3c-0000000000ed]
12081 1726882384.55318: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000ed
12081 1726882384.55500: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000ed
12081 1726882384.55511: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
12081 1726882384.55573: no more pending results, returning what we have
12081 1726882384.55576: results queue empty
12081 1726882384.55577: checking for any_errors_fatal
12081 1726882384.55592: done checking for any_errors_fatal
12081 1726882384.55592: checking for max_fail_percentage
12081 1726882384.55594: done checking for max_fail_percentage
12081 1726882384.55595: checking to see if all hosts have failed and the running result is not ok
12081 1726882384.55596: done checking to see if all hosts have failed
12081 1726882384.55596: getting the remaining hosts for this loop
12081 1726882384.55598: done getting the remaining hosts for this loop
12081 1726882384.55602: getting the next task for host managed_node3
12081 1726882384.55609: done getting next task for host managed_node3
12081 1726882384.55612: ^ task is: TASK: Show current_interfaces
12081 1726882384.55617: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882384.55621: getting variables
12081 1726882384.55622: in VariableManager get_vars()
12081 1726882384.55653: Calling all_inventory to load vars for managed_node3
12081 1726882384.55656: Calling groups_inventory to load vars for managed_node3
12081 1726882384.55660: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882384.55674: Calling all_plugins_play to load vars for managed_node3
12081 1726882384.55677: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882384.55680: Calling groups_plugins_play to load vars for managed_node3
12081 1726882384.55849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882384.56087: done with get_vars()
12081 1726882384.56097: done getting variables
12081 1726882384.56149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024 21:33:04 -0400 (0:00:00.064) 0:00:04.364 ******
12081 1726882384.56179: entering _queue_task() for managed_node3/debug
12081 1726882384.56751: worker is 1 (out of 1 available)
12081 1726882384.56763: exiting _queue_task() for managed_node3/debug
12081 1726882384.56777: done queuing things up, now waiting for results queue to drain
12081 1726882384.56778: waiting for pending results...
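Each TASK banner in this log carries two timings: the parenthesised value is the duration of the task that just finished, and the trailing value is the cumulative playbook runtime (the format suggests a profile_tasks-style timer callback, though the log does not name one). The two banners in this section are self-consistent, as a quick check shows:

```python
# Consistency check on the banner timings in this section:
#   "Set current_interfaces"  banner: (0:00:00.475) 0:00:04.300
#   "Show current_interfaces" banner: (0:00:00.064) 0:00:04.364
cumulative_at_first_banner = 4.300   # seconds, from the first banner
set_fact_task_duration = 0.064       # duration reported in the second banner
cumulative_at_second_banner = round(cumulative_at_first_banner + set_fact_task_duration, 3)
print(cumulative_at_second_banner)   # 4.364, matching the second banner
```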
12081 1726882384.57506: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 12081 1726882384.57680: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000000b2 12081 1726882384.57785: variable 'ansible_search_path' from source: unknown 12081 1726882384.57798: variable 'ansible_search_path' from source: unknown 12081 1726882384.57840: calling self._execute() 12081 1726882384.58059: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.58071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.58081: variable 'omit' from source: magic vars 12081 1726882384.58886: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.58904: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.58914: variable 'omit' from source: magic vars 12081 1726882384.58970: variable 'omit' from source: magic vars 12081 1726882384.59085: variable 'current_interfaces' from source: set_fact 12081 1726882384.59233: variable 'omit' from source: magic vars 12081 1726882384.59359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882384.59398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882384.59444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882384.59553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.59571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.59603: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882384.59613: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.59616: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.59831: Set connection var ansible_pipelining to False 12081 1726882384.59840: Set connection var ansible_shell_type to sh 12081 1726882384.59870: Set connection var ansible_shell_executable to /bin/sh 12081 1726882384.59973: Set connection var ansible_connection to ssh 12081 1726882384.59983: Set connection var ansible_timeout to 10 12081 1726882384.59992: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882384.60019: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.60026: variable 'ansible_connection' from source: unknown 12081 1726882384.60032: variable 'ansible_module_compression' from source: unknown 12081 1726882384.60037: variable 'ansible_shell_type' from source: unknown 12081 1726882384.60043: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.60048: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.60059: variable 'ansible_pipelining' from source: unknown 12081 1726882384.60070: variable 'ansible_timeout' from source: unknown 12081 1726882384.60077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.60321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882384.60382: variable 'omit' from source: magic vars 12081 1726882384.60392: starting attempt loop 12081 1726882384.60401: running the handler 12081 1726882384.60553: handler run complete 12081 1726882384.60576: attempt loop complete, returning result 12081 1726882384.60582: _execute() done 12081 1726882384.60588: dumping result to json 12081 1726882384.60596: done dumping result, returning 12081 1726882384.60606: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-0a3f-ff3c-0000000000b2] 12081 1726882384.60610: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000b2 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 12081 1726882384.60758: no more pending results, returning what we have 12081 1726882384.60762: results queue empty 12081 1726882384.60766: checking for any_errors_fatal 12081 1726882384.60771: done checking for any_errors_fatal 12081 1726882384.60771: checking for max_fail_percentage 12081 1726882384.60774: done checking for max_fail_percentage 12081 1726882384.60775: checking to see if all hosts have failed and the running result is not ok 12081 1726882384.60776: done checking to see if all hosts have failed 12081 1726882384.60777: getting the remaining hosts for this loop 12081 1726882384.60779: done getting the remaining hosts for this loop 12081 1726882384.60784: getting the next task for host managed_node3 12081 1726882384.60793: done getting next task for host managed_node3 12081 1726882384.60796: ^ task is: TASK: Setup 12081 1726882384.60800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882384.60804: getting variables 12081 1726882384.60805: in VariableManager get_vars() 12081 1726882384.60834: Calling all_inventory to load vars for managed_node3 12081 1726882384.60837: Calling groups_inventory to load vars for managed_node3 12081 1726882384.60840: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882384.60853: Calling all_plugins_play to load vars for managed_node3 12081 1726882384.60857: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882384.60860: Calling groups_plugins_play to load vars for managed_node3 12081 1726882384.61032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882384.61228: done with get_vars() 12081 1726882384.61239: done getting variables 12081 1726882384.61270: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000000b2 12081 1726882384.61273: WORKER PROCESS EXITING TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:33:04 -0400 (0:00:00.051) 0:00:04.416 ****** 12081 1726882384.61335: entering _queue_task() for managed_node3/include_tasks 12081 1726882384.61579: worker is 1 (out of 1 available) 12081 1726882384.61591: exiting _queue_task() for managed_node3/include_tasks 12081 1726882384.61605: done queuing things up, now waiting for results queue to drain 12081 1726882384.61606: waiting for pending results... 
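The `Setup` task queued here (task path `.../tasks/run_test.yml:24`) enters `_queue_task()` as an `include_tasks` action, and the per-`item` variable resolution that follows indicates it loops over the `lsr_setup` include parameter. A minimal sketch of that pattern, assuming the usual linux-system-roles test layout (the loop body is inferred, not taken from the file):

```yaml
# Assumed shape of the Setup task in tasks/run_test.yml. The items
# (tasks/create_test_interfaces_with_dhcp.yml and
# tasks/assert_dhcp_device_present.yml) arrive via the lsr_setup
# include parameter, as the subsequent log entries show.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```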
12081 1726882384.62499: running TaskExecutor() for managed_node3/TASK: Setup 12081 1726882384.62708: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008b 12081 1726882384.62730: variable 'ansible_search_path' from source: unknown 12081 1726882384.62738: variable 'ansible_search_path' from source: unknown 12081 1726882384.62812: variable 'lsr_setup' from source: include params 12081 1726882384.63195: variable 'lsr_setup' from source: include params 12081 1726882384.63396: variable 'omit' from source: magic vars 12081 1726882384.63788: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.63804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.63821: variable 'omit' from source: magic vars 12081 1726882384.64333: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.64349: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.64363: variable 'item' from source: unknown 12081 1726882384.64552: variable 'item' from source: unknown 12081 1726882384.64595: variable 'item' from source: unknown 12081 1726882384.64745: variable 'item' from source: unknown 12081 1726882384.64928: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.64995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.65009: variable 'omit' from source: magic vars 12081 1726882384.65519: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.65534: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.65643: variable 'item' from source: unknown 12081 1726882384.65713: variable 'item' from source: unknown 12081 1726882384.65780: variable 'item' from source: unknown 12081 1726882384.65843: variable 'item' from source: unknown 12081 1726882384.66044: dumping result to json 12081 1726882384.66056: done dumping result, returning 12081 
1726882384.66072: done running TaskExecutor() for managed_node3/TASK: Setup [0e448fcc-3ce9-0a3f-ff3c-00000000008b] 12081 1726882384.66082: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008b 12081 1726882384.66186: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008b 12081 1726882384.66193: WORKER PROCESS EXITING 12081 1726882384.66219: no more pending results, returning what we have 12081 1726882384.66224: in VariableManager get_vars() 12081 1726882384.66256: Calling all_inventory to load vars for managed_node3 12081 1726882384.66260: Calling groups_inventory to load vars for managed_node3 12081 1726882384.66265: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882384.66280: Calling all_plugins_play to load vars for managed_node3 12081 1726882384.66284: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882384.66287: Calling groups_plugins_play to load vars for managed_node3 12081 1726882384.66458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882384.66687: done with get_vars() 12081 1726882384.66694: variable 'ansible_search_path' from source: unknown 12081 1726882384.66695: variable 'ansible_search_path' from source: unknown 12081 1726882384.66735: variable 'ansible_search_path' from source: unknown 12081 1726882384.66736: variable 'ansible_search_path' from source: unknown 12081 1726882384.66766: we have included files to process 12081 1726882384.66768: generating all_blocks data 12081 1726882384.66769: done generating all_blocks data 12081 1726882384.66773: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882384.66774: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882384.66776: 
Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882384.69203: done processing included file 12081 1726882384.69205: iterating over new_blocks loaded from include file 12081 1726882384.69207: in VariableManager get_vars() 12081 1726882384.69223: done with get_vars() 12081 1726882384.69224: filtering new block on tags 12081 1726882384.69274: done filtering new block on tags 12081 1726882384.69277: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 12081 1726882384.69281: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882384.69282: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882384.69286: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882384.69410: in VariableManager get_vars() 12081 1726882384.69427: done with get_vars() 12081 1726882384.69434: variable 'item' from source: include params 12081 1726882384.69536: variable 'item' from source: include params 12081 1726882384.69574: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12081 1726882384.70415: in VariableManager get_vars() 12081 1726882384.70434: done with get_vars() 12081 1726882384.70583: in VariableManager 
get_vars() 12081 1726882384.70599: done with get_vars() 12081 1726882384.70604: variable 'item' from source: include params 12081 1726882384.70670: variable 'item' from source: include params 12081 1726882384.70700: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12081 1726882384.70776: in VariableManager get_vars() 12081 1726882384.70793: done with get_vars() 12081 1726882384.70888: done processing included file 12081 1726882384.70890: iterating over new_blocks loaded from include file 12081 1726882384.70891: in VariableManager get_vars() 12081 1726882384.70903: done with get_vars() 12081 1726882384.70904: filtering new block on tags 12081 1726882384.70979: done filtering new block on tags 12081 1726882384.70982: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 12081 1726882384.70986: extending task lists for all hosts with included blocks 12081 1726882384.71750: done extending task lists 12081 1726882384.71752: done processing included files 12081 1726882384.71753: results queue empty 12081 1726882384.71753: checking for any_errors_fatal 12081 1726882384.71757: done checking for any_errors_fatal 12081 1726882384.71758: checking for max_fail_percentage 12081 1726882384.71760: done checking for max_fail_percentage 12081 1726882384.71760: checking to see if all hosts have failed and the running result is not ok 12081 1726882384.71761: done checking to see if all hosts have failed 12081 1726882384.71769: getting the remaining hosts for this loop 12081 1726882384.71771: done getting the remaining hosts for this loop 12081 
1726882384.71774: getting the next task for host managed_node3 12081 1726882384.71778: done getting next task for host managed_node3 12081 1726882384.71780: ^ task is: TASK: Install dnsmasq 12081 1726882384.71783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882384.71785: getting variables 12081 1726882384.71786: in VariableManager get_vars() 12081 1726882384.71795: Calling all_inventory to load vars for managed_node3 12081 1726882384.71798: Calling groups_inventory to load vars for managed_node3 12081 1726882384.71800: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882384.71806: Calling all_plugins_play to load vars for managed_node3 12081 1726882384.71808: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882384.71811: Calling groups_plugins_play to load vars for managed_node3 12081 1726882384.71988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882384.72207: done with get_vars() 12081 1726882384.72216: done getting variables 12081 1726882384.72256: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:04 -0400 (0:00:00.109) 0:00:04.525 ****** 12081 1726882384.72285: entering _queue_task() for managed_node3/package 12081 1726882384.72619: worker is 1 (out of 1 available) 12081 1726882384.72635: exiting _queue_task() for managed_node3/package 12081 1726882384.72652: done queuing things up, now waiting for results queue to drain 12081 1726882384.72654: waiting for pending results... 
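The `Install dnsmasq` task (task path `.../tasks/create_test_interfaces_with_dhcp.yml:3`) runs through the generic `package` action plugin loaded above, which the log later resolves to `ansible.legacy.dnf` when building the AnsiballZ payload. A minimal sketch of such a task, assuming the conventional form (the actual task file is not shown in this log):

```yaml
# Hypothetical form of the task at create_test_interfaces_with_dhcp.yml:3;
# `package` dispatches to the platform's manager (dnf on this host, per
# the ANSIBALLZ lines that follow).
- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
```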
12081 1726882384.72946: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 12081 1726882384.73038: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000112 12081 1726882384.73052: variable 'ansible_search_path' from source: unknown 12081 1726882384.73056: variable 'ansible_search_path' from source: unknown 12081 1726882384.73096: calling self._execute() 12081 1726882384.73173: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.73213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.73308: variable 'omit' from source: magic vars 12081 1726882384.74131: variable 'ansible_distribution_major_version' from source: facts 12081 1726882384.74217: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882384.74250: variable 'omit' from source: magic vars 12081 1726882384.74707: variable 'omit' from source: magic vars 12081 1726882384.74918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882384.77526: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882384.77621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882384.77667: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882384.77708: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882384.77736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882384.77839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882384.77878: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882384.77914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882384.77960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882384.77980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882384.78100: variable '__network_is_ostree' from source: set_fact 12081 1726882384.78119: variable 'omit' from source: magic vars 12081 1726882384.78155: variable 'omit' from source: magic vars 12081 1726882384.78193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882384.78229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882384.78252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882384.78276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.78289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882384.78326: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882384.78334: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.78341: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882384.78447: Set connection var ansible_pipelining to False 12081 1726882384.78457: Set connection var ansible_shell_type to sh 12081 1726882384.78472: Set connection var ansible_shell_executable to /bin/sh 12081 1726882384.78481: Set connection var ansible_connection to ssh 12081 1726882384.78493: Set connection var ansible_timeout to 10 12081 1726882384.78502: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882384.78531: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.78546: variable 'ansible_connection' from source: unknown 12081 1726882384.78556: variable 'ansible_module_compression' from source: unknown 12081 1726882384.78565: variable 'ansible_shell_type' from source: unknown 12081 1726882384.78572: variable 'ansible_shell_executable' from source: unknown 12081 1726882384.78579: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882384.78587: variable 'ansible_pipelining' from source: unknown 12081 1726882384.78594: variable 'ansible_timeout' from source: unknown 12081 1726882384.78601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882384.78712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882384.78727: variable 'omit' from source: magic vars 12081 1726882384.78737: starting attempt loop 12081 1726882384.78743: running the handler 12081 1726882384.78767: variable 'ansible_facts' from source: unknown 12081 1726882384.78775: variable 'ansible_facts' from source: unknown 12081 1726882384.78826: _low_level_execute_command(): starting 12081 1726882384.78839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882384.81074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.81102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.81117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.81136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.81182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.81201: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.81215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.81233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882384.81246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882384.81259: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882384.81884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.81901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.81919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.81932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.81947: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882384.81965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.82044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882384.82116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882384.82134: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.82399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.83949: stdout chunk (state=3): >>>/root <<< 12081 1726882384.84137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.84141: stdout chunk (state=3): >>><<< 12081 1726882384.84143: stderr chunk (state=3): >>><<< 12081 1726882384.84256: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882384.84262: _low_level_execute_command(): starting 12081 1726882384.84269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219 `" && echo 
ansible-tmp-1726882384.8416939-12307-246580205113219="` echo /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219 `" ) && sleep 0' 12081 1726882384.85668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882384.85793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.85805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.85816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.85856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.85862: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882384.85874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.85890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882384.85902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882384.85908: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882384.85917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882384.85926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882384.85937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882384.85945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882384.85954: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882384.85961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882384.86148: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882384.86171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882384.86181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882384.86309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882384.88210: stdout chunk (state=3): >>>ansible-tmp-1726882384.8416939-12307-246580205113219=/root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219 <<< 12081 1726882384.88383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882384.88404: stderr chunk (state=3): >>><<< 12081 1726882384.88407: stdout chunk (state=3): >>><<< 12081 1726882384.88436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.8416939-12307-246580205113219=/root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882384.88476: variable 'ansible_module_compression' from source: unknown 12081 1726882384.88539: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 12081 1726882384.88543: ANSIBALLZ: Acquiring lock 12081 1726882384.88546: ANSIBALLZ: Lock acquired: 139893497835168 12081 1726882384.88548: ANSIBALLZ: Creating module 12081 1726882385.05059: ANSIBALLZ: Writing module into payload 12081 1726882385.05250: ANSIBALLZ: Writing module 12081 1726882385.05277: ANSIBALLZ: Renaming module 12081 1726882385.05287: ANSIBALLZ: Done creating module 12081 1726882385.05301: variable 'ansible_facts' from source: unknown 12081 1726882385.05361: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/AnsiballZ_dnf.py 12081 1726882385.05472: Sending initial data 12081 1726882385.05476: Sent initial data (152 bytes) 12081 1726882385.06489: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882385.06493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882385.06495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882385.06498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882385.06500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882385.06502: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882385.06504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.06506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882385.06508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882385.06510: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 12081 1726882385.06517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882385.06519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882385.06521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882385.06522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882385.06524: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882385.06528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.06532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882385.06545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882385.06547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882385.06646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882385.08503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882385.08593: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882385.08690: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp2sore7l8 /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/AnsiballZ_dnf.py <<< 12081 1726882385.08778: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882385.10096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882385.10246: stderr chunk (state=3): >>><<< 12081 1726882385.10249: stdout chunk (state=3): >>><<< 12081 1726882385.10273: done transferring module to remote 12081 1726882385.10284: _low_level_execute_command(): starting 12081 1726882385.10289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/ /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/AnsiballZ_dnf.py && sleep 0' 12081 1726882385.11015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882385.11024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882385.11030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882385.11043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.11067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882385.11071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found <<< 12081 1726882385.11082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.11138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882385.11141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882385.11253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882385.13001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882385.13050: stderr chunk (state=3): >>><<< 12081 1726882385.13053: stdout chunk (state=3): >>><<< 12081 1726882385.13069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882385.13074: _low_level_execute_command(): 
starting 12081 1726882385.13076: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/AnsiballZ_dnf.py && sleep 0' 12081 1726882385.13512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882385.13516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882385.13548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.13552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882385.13554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882385.13605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882385.13610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882385.13733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.15113: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, 
"bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12081 1726882386.21081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882386.21085: stderr chunk (state=3): >>><<< 12081 1726882386.21088: stdout chunk (state=3): >>><<< 12081 1726882386.21113: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882386.21162: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882386.21169: _low_level_execute_command(): starting 12081 1726882386.21174: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.8416939-12307-246580205113219/ > /dev/null 2>&1 && sleep 0' 12081 1726882386.22879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882386.22892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.22903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.22918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.23010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882386.23017: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882386.23028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.23083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882386.23091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882386.23099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882386.23111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.23120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.23131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.23140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882386.23145: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882386.23157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.23302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882386.23314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.23330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.23539: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.25493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882386.25496: stdout chunk (state=3): >>><<< 12081 1726882386.25502: stderr chunk (state=3): >>><<< 12081 1726882386.25522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882386.25529: handler run complete 12081 1726882386.25698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882386.25873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882386.25911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882386.25939: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882386.25973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882386.26038: variable '__install_status' from source: unknown 12081 1726882386.26059: Evaluated conditional (__install_status is success): True 12081 1726882386.26077: attempt loop complete, returning result 12081 1726882386.26083: _execute() done 12081 1726882386.26087: dumping result to json 12081 1726882386.26092: done dumping result, returning 12081 1726882386.26100: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0e448fcc-3ce9-0a3f-ff3c-000000000112] 12081 1726882386.26107: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000112 12081 1726882386.26213: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000112 12081 1726882386.26216: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12081 1726882386.26307: no more pending results, returning what we have 12081 1726882386.26311: results queue empty 12081 1726882386.26312: checking for any_errors_fatal 12081 1726882386.26313: done checking for any_errors_fatal 12081 1726882386.26313: checking for max_fail_percentage 12081 1726882386.26315: done checking for max_fail_percentage 12081 1726882386.26316: checking to see if all hosts have failed and the running result is not ok 12081 1726882386.26317: done checking to see if all hosts have failed 12081 1726882386.26318: getting the remaining hosts for this loop 12081 1726882386.26320: done getting the remaining hosts for this loop 12081 1726882386.26323: getting the next task for host managed_node3 12081 1726882386.26329: done getting next task for host managed_node3 12081 1726882386.26331: ^ task is: TASK: Install pgrep, sysctl 12081 1726882386.26335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882386.26338: getting variables 12081 1726882386.26340: in VariableManager get_vars() 12081 1726882386.26370: Calling all_inventory to load vars for managed_node3 12081 1726882386.26372: Calling groups_inventory to load vars for managed_node3 12081 1726882386.26376: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882386.26388: Calling all_plugins_play to load vars for managed_node3 12081 1726882386.26391: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882386.26393: Calling groups_plugins_play to load vars for managed_node3 12081 1726882386.26569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882386.26795: done with get_vars() 12081 1726882386.26805: done getting variables 12081 1726882386.26859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:06 -0400 (0:00:01.546) 0:00:06.071 ****** 12081 1726882386.26891: entering _queue_task() for managed_node3/package 12081 1726882386.27434: worker is 1 (out of 1 available) 12081 1726882386.27559: exiting _queue_task() for managed_node3/package 12081 1726882386.27576: done queuing things up, now waiting for results queue to drain 12081 1726882386.27577: waiting for pending results... 12081 1726882386.29898: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12081 1726882386.29995: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000113 12081 1726882386.30007: variable 'ansible_search_path' from source: unknown 12081 1726882386.30010: variable 'ansible_search_path' from source: unknown 12081 1726882386.30046: calling self._execute() 12081 1726882386.30125: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882386.30129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882386.30139: variable 'omit' from source: magic vars 12081 1726882386.30491: variable 'ansible_distribution_major_version' from source: facts 12081 1726882386.30504: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882386.30618: variable 'ansible_os_family' from source: facts 12081 1726882386.30622: Evaluated conditional (ansible_os_family == 'RedHat'): True 12081 1726882386.31503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882386.31776: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882386.31820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882386.31852: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882386.31887: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882386.31967: variable 'ansible_distribution_major_version' from source: facts 12081 1726882386.31979: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 12081 1726882386.31982: when evaluation is False, skipping this task 12081 1726882386.31985: _execute() done 12081 1726882386.31987: dumping result to json 12081 1726882386.31989: done dumping result, returning 12081 1726882386.31998: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0e448fcc-3ce9-0a3f-ff3c-000000000113] 12081 1726882386.32004: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000113 12081 1726882386.32100: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000113 12081 1726882386.32103: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 12081 1726882386.32147: no more pending results, returning what we have 12081 1726882386.32150: results queue empty 12081 1726882386.32151: checking for any_errors_fatal 12081 1726882386.32161: done checking for any_errors_fatal 12081 1726882386.32162: checking for max_fail_percentage 12081 1726882386.32165: done checking for max_fail_percentage 12081 1726882386.32166: checking to see if all hosts have failed and the running result is not ok 12081 1726882386.32167: done checking to see if all hosts have failed 12081 1726882386.32168: getting the remaining hosts for this loop 12081 1726882386.32170: done getting the remaining hosts for this loop 12081 1726882386.32174: getting the next task for host managed_node3 12081 1726882386.32181: done getting next task for host managed_node3 12081 1726882386.32184: ^ task is: TASK: Install 
pgrep, sysctl 12081 1726882386.32188: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882386.32192: getting variables 12081 1726882386.32194: in VariableManager get_vars() 12081 1726882386.32224: Calling all_inventory to load vars for managed_node3 12081 1726882386.32227: Calling groups_inventory to load vars for managed_node3 12081 1726882386.32230: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882386.32243: Calling all_plugins_play to load vars for managed_node3 12081 1726882386.32246: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882386.32249: Calling groups_plugins_play to load vars for managed_node3 12081 1726882386.32433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882386.32640: done with get_vars() 12081 1726882386.32651: done getting variables 12081 1726882386.32718: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:06 -0400 (0:00:00.058) 0:00:06.130 ****** 12081 1726882386.32753: entering _queue_task() for managed_node3/package 12081 1726882386.33522: worker is 1 (out of 1 available) 12081 1726882386.33533: exiting _queue_task() for managed_node3/package 12081 1726882386.33546: done queuing things up, now waiting for results queue to drain 12081 1726882386.33548: waiting for pending results... 12081 1726882386.33819: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12081 1726882386.33959: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000114 12081 1726882386.33981: variable 'ansible_search_path' from source: unknown 12081 1726882386.33996: variable 'ansible_search_path' from source: unknown 12081 1726882386.34038: calling self._execute() 12081 1726882386.34142: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882386.34153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882386.34176: variable 'omit' from source: magic vars 12081 1726882386.34546: variable 'ansible_distribution_major_version' from source: facts 12081 1726882386.34571: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882386.34699: variable 'ansible_os_family' from source: facts 12081 1726882386.34710: Evaluated conditional (ansible_os_family == 'RedHat'): True 12081 1726882386.34903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882386.35611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882386.35756: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882386.35859: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882386.36637: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882386.36866: variable 'ansible_distribution_major_version' from source: facts 12081 1726882386.36885: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12081 1726882386.36897: variable 'omit' from source: magic vars 12081 1726882386.36995: variable 'omit' from source: magic vars 12081 1726882386.37314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882386.41577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882386.41768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882386.41947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882386.41986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882386.42131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882386.42226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882386.42266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882386.42373: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882386.42418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882386.42470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882386.42660: variable '__network_is_ostree' from source: set_fact 12081 1726882386.42788: variable 'omit' from source: magic vars 12081 1726882386.42822: variable 'omit' from source: magic vars 12081 1726882386.42854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882386.42998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882386.43021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882386.43042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882386.43057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882386.43094: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882386.43106: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882386.43114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882386.43330: Set connection var ansible_pipelining to False 12081 1726882386.43339: Set connection var ansible_shell_type to sh 12081 1726882386.43352: Set connection var 
ansible_shell_executable to /bin/sh 12081 1726882386.43435: Set connection var ansible_connection to ssh 12081 1726882386.43446: Set connection var ansible_timeout to 10 12081 1726882386.43456: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882386.43491: variable 'ansible_shell_executable' from source: unknown 12081 1726882386.43500: variable 'ansible_connection' from source: unknown 12081 1726882386.43507: variable 'ansible_module_compression' from source: unknown 12081 1726882386.43514: variable 'ansible_shell_type' from source: unknown 12081 1726882386.43520: variable 'ansible_shell_executable' from source: unknown 12081 1726882386.43527: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882386.43540: variable 'ansible_pipelining' from source: unknown 12081 1726882386.43548: variable 'ansible_timeout' from source: unknown 12081 1726882386.43556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882386.43669: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882386.43686: variable 'omit' from source: magic vars 12081 1726882386.43696: starting attempt loop 12081 1726882386.43703: running the handler 12081 1726882386.43715: variable 'ansible_facts' from source: unknown 12081 1726882386.43722: variable 'ansible_facts' from source: unknown 12081 1726882386.43783: _low_level_execute_command(): starting 12081 1726882386.43796: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882386.44598: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882386.44619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.44637: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.44656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.44709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882386.44737: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882386.44753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.44775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882386.44797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882386.44810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882386.44824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.44842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.44865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.44887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882386.44900: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882386.44913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.44993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882386.45018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.45035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.45173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.46826: stdout chunk 
(state=3): >>>/root <<< 12081 1726882386.47025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882386.47028: stdout chunk (state=3): >>><<< 12081 1726882386.47031: stderr chunk (state=3): >>><<< 12081 1726882386.47145: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882386.47149: _low_level_execute_command(): starting 12081 1726882386.47152: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128 `" && echo ansible-tmp-1726882386.4705138-12375-53892451482128="` echo /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128 `" ) && sleep 0' 12081 1726882386.48761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 12081 1726882386.48769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.48810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.48814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.48816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.49008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.49097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.49260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.51144: stdout chunk (state=3): >>>ansible-tmp-1726882386.4705138-12375-53892451482128=/root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128 <<< 12081 1726882386.51255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882386.51340: stderr chunk (state=3): >>><<< 12081 1726882386.51343: stdout chunk (state=3): >>><<< 12081 1726882386.51472: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882386.4705138-12375-53892451482128=/root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882386.51475: variable 'ansible_module_compression' from source: unknown 12081 1726882386.51478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12081 1726882386.51502: variable 'ansible_facts' from source: unknown 12081 1726882386.51602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/AnsiballZ_dnf.py 12081 1726882386.52400: Sending initial data 12081 1726882386.52404: Sent initial data (151 bytes) 12081 1726882386.54503: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882386.54508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.54555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.54559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.54579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882386.54588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.54680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.54699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.54822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.56585: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 12081 1726882386.56688: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882386.56793: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpr_fdbckl /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/AnsiballZ_dnf.py <<< 12081 1726882386.56893: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882386.59021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882386.59025: stderr chunk (state=3): >>><<< 12081 1726882386.59030: stdout chunk (state=3): >>><<< 12081 1726882386.59057: done transferring module to remote 12081 1726882386.59071: _low_level_execute_command(): starting 12081 1726882386.59134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/ /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/AnsiballZ_dnf.py && sleep 0' 12081 1726882386.59568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.59571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.59604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.59607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882386.59609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.59668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882386.59671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.59674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.59780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882386.61884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882386.61954: stderr chunk (state=3): >>><<< 12081 1726882386.61957: stdout chunk (state=3): >>><<< 12081 1726882386.62045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882386.62048: _low_level_execute_command(): starting 12081 1726882386.62051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/AnsiballZ_dnf.py && sleep 0' 12081 1726882386.63014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882386.63030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.63048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.63074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.63118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882386.63130: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882386.63144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.63169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882386.63186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882386.63197: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882386.63209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882386.63222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882386.63240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882386.63251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 
12081 1726882386.63266: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882386.63285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882386.63361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882386.63386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882386.63403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882386.63539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.64645: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12081 1726882387.70708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882387.70714: stdout chunk (state=3): >>><<< 12081 1726882387.70716: stderr chunk (state=3): >>><<< 12081 1726882387.70883: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882387.70887: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882387.70890: _low_level_execute_command(): starting 12081 1726882387.70895: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882386.4705138-12375-53892451482128/ > /dev/null 2>&1 && sleep 0' 12081 1726882387.71418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882387.71421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.71458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882387.71462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882387.71466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882387.71468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.71553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.71578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882387.71593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.71724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.73546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882387.73591: stderr chunk (state=3): >>><<< 12081 1726882387.73595: stdout chunk (state=3): >>><<< 12081 1726882387.73608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12081 1726882387.73614: handler run complete
12081 1726882387.73638: attempt loop complete, returning result
12081 1726882387.73641: _execute() done
12081 1726882387.73643: dumping result to json
12081 1726882387.73648: done dumping result, returning
12081 1726882387.73658: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0e448fcc-3ce9-0a3f-ff3c-000000000114]
12081 1726882387.73665: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000114
12081 1726882387.73759: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000114
12081 1726882387.73762: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
12081 1726882387.73833: no more pending results, returning what we have
12081 1726882387.73836: results queue empty
12081 1726882387.73837: checking for any_errors_fatal
12081 1726882387.73841: done checking for any_errors_fatal
12081 1726882387.73842: checking for max_fail_percentage
12081 1726882387.73843: done checking for max_fail_percentage
12081 1726882387.73844: checking to see if all hosts have failed and the running result is not ok
12081 1726882387.73845: done checking to see if all hosts have failed
12081 1726882387.73846: getting the remaining hosts for this loop
12081 1726882387.73848: done getting the remaining hosts for this loop
12081 1726882387.73852: getting the next task for host managed_node3
12081 1726882387.73858: done getting next task for host managed_node3
12081 1726882387.73862: ^ task is: TASK: Create test interfaces
12081 1726882387.73867: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882387.73876: getting variables
12081 1726882387.73878: in VariableManager get_vars()
12081 1726882387.73907: Calling all_inventory to load vars for managed_node3
12081 1726882387.73909: Calling groups_inventory to load vars for managed_node3
12081 1726882387.73913: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882387.73923: Calling all_plugins_play to load vars for managed_node3
12081 1726882387.73926: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882387.73928: Calling groups_plugins_play to load vars for managed_node3
12081 1726882387.74107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882387.74359: done with get_vars()
12081 1726882387.74372: done getting variables
12081 1726882387.74477: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Create test interfaces] **************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Friday 20 September 2024 21:33:07 -0400 (0:00:01.417) 0:00:07.548 ******
12081 1726882387.74510: entering _queue_task() for managed_node3/shell
12081 1726882387.74516: Creating lock for shell
12081 1726882387.74800: worker is 1 (out of 1 available)
12081 1726882387.74812: exiting _queue_task() for managed_node3/shell
12081 1726882387.74823: done queuing things up, now waiting for results queue to drain
12081 1726882387.74824: waiting for pending results...
12081 1726882387.75218: running TaskExecutor() for managed_node3/TASK: Create test interfaces
12081 1726882387.75310: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000115
12081 1726882387.75327: variable 'ansible_search_path' from source: unknown
12081 1726882387.75336: variable 'ansible_search_path' from source: unknown
12081 1726882387.75371: calling self._execute()
12081 1726882387.75433: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882387.75437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882387.75446: variable 'omit' from source: magic vars
12081 1726882387.75716: variable 'ansible_distribution_major_version' from source: facts
12081 1726882387.75727: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882387.75732: variable 'omit' from source: magic vars
12081 1726882387.75768: variable 'omit' from source: magic vars
12081 1726882387.76019: variable 'dhcp_interface1' from source: play vars
12081 1726882387.76022: variable 'dhcp_interface2' from source: play vars
12081 1726882387.76039: variable 'omit' from source: magic vars
12081 1726882387.76077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12081 1726882387.76101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True,
class_only=False) 12081 1726882387.76116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882387.76129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882387.76139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882387.76166: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882387.76169: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882387.76172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882387.76238: Set connection var ansible_pipelining to False 12081 1726882387.76241: Set connection var ansible_shell_type to sh 12081 1726882387.76247: Set connection var ansible_shell_executable to /bin/sh 12081 1726882387.76253: Set connection var ansible_connection to ssh 12081 1726882387.76255: Set connection var ansible_timeout to 10 12081 1726882387.76261: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882387.76284: variable 'ansible_shell_executable' from source: unknown 12081 1726882387.76287: variable 'ansible_connection' from source: unknown 12081 1726882387.76290: variable 'ansible_module_compression' from source: unknown 12081 1726882387.76292: variable 'ansible_shell_type' from source: unknown 12081 1726882387.76295: variable 'ansible_shell_executable' from source: unknown 12081 1726882387.76297: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882387.76299: variable 'ansible_pipelining' from source: unknown 12081 1726882387.76301: variable 'ansible_timeout' from source: unknown 12081 1726882387.76304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882387.76402: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882387.76410: variable 'omit' from source: magic vars 12081 1726882387.76417: starting attempt loop 12081 1726882387.76420: running the handler 12081 1726882387.76427: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882387.76442: _low_level_execute_command(): starting 12081 1726882387.76449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882387.76975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882387.76991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.77002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882387.77014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882387.77028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.77079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.77092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.77197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.78771: stdout chunk (state=3): >>>/root <<< 12081 1726882387.78871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882387.78924: stderr chunk (state=3): >>><<< 12081 1726882387.78927: stdout chunk (state=3): >>><<< 12081 1726882387.78947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882387.78959: _low_level_execute_command(): starting 
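The first `_low_level_execute_command()` above runs `/bin/sh -c 'echo ~ && sleep 0'` over the multiplexed SSH connection: the remote shell expands `~` to the login user's home directory (here `/root`), which anchors the `~/.ansible/tmp` paths used in the following steps. The trailing `sleep 0` appears to be there to force output flushing on some platforms (an inference from the log, not confirmed here). A minimal local reproduction of the probe:

```shell
#!/bin/sh
# Reproduce Ansible's remote home-directory probe in a local shell
# (Ansible runs the same command over SSH).
home=$(/bin/sh -c 'echo ~ && sleep 0')

# The result should be an absolute path, e.g. /root when run as root.
case "$home" in
    /*) echo "home resolved: $home" ;;
    *)  echo "unexpected: $home" >&2; exit 1 ;;
esac
```
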
12081 1726882387.78972: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984 `" && echo ansible-tmp-1726882387.7894676-12416-13275984700984="` echo /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984 `" ) && sleep 0' 12081 1726882387.79433: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882387.79460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.79476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882387.79487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.79532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.79544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.79658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.81518: stdout chunk (state=3): 
>>>ansible-tmp-1726882387.7894676-12416-13275984700984=/root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984 <<< 12081 1726882387.81624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882387.81685: stderr chunk (state=3): >>><<< 12081 1726882387.81688: stdout chunk (state=3): >>><<< 12081 1726882387.81705: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882387.7894676-12416-13275984700984=/root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882387.81732: variable 'ansible_module_compression' from source: unknown 12081 1726882387.81785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882387.81805: variable 'ansible_facts' from source: unknown 12081 
1726882387.81875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/AnsiballZ_command.py 12081 1726882387.81985: Sending initial data 12081 1726882387.81994: Sent initial data (155 bytes) 12081 1726882387.82687: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882387.82690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.82723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.82728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882387.82730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.82790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.82793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882387.82801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.82898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.84635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882387.84727: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882387.84827: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp4nv6c_51 /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/AnsiballZ_command.py <<< 12081 1726882387.84921: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882387.85946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882387.86049: stderr chunk (state=3): >>><<< 12081 1726882387.86055: stdout chunk (state=3): >>><<< 12081 1726882387.86075: done transferring module to remote 12081 1726882387.86083: _low_level_execute_command(): starting 12081 1726882387.86088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/ /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/AnsiballZ_command.py && sleep 0' 12081 1726882387.86539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882387.86543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.86578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found <<< 12081 1726882387.86591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882387.86602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.86649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.86662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.86775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882387.88507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882387.88557: stderr chunk (state=3): >>><<< 12081 1726882387.88560: stdout chunk (state=3): >>><<< 12081 1726882387.88575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882387.88578: _low_level_execute_command(): starting 12081 1726882387.88583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/AnsiballZ_command.py && sleep 0' 12081 1726882387.89013: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882387.89019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882387.89049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.89068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882387.89079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882387.89122: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882387.89133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882387.89243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.23887: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create 
the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 12081 1726882389.23893: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:08.022452", "end": "2024-09-20 21:33:09.236792", "delta": "0:00:01.214340", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882389.25257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882389.25261: stdout chunk (state=3): >>><<< 12081 1726882389.25265: stderr chunk (state=3): >>><<< 12081 1726882389.25439: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:08.022452", "end": "2024-09-20 21:33:09.236792", "delta": "0:00:01.214340", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
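Before transferring `AnsiballZ_command.py`, the log shows Ansible creating a per-task remote temp directory inside a `( umask 77 && mkdir ... )` subshell. The `umask 77` matters: the directories come up mode 700, so only the connecting user can read the module payload staged there. A sketch of the same pattern, with an illustrative directory name standing in for the real `ansible-tmp-<epoch>-<pid>-<random>` name:

```shell
#!/bin/sh
# Sketch of Ansible's remote temp-dir creation. "$base" and the
# "ansible-tmp-demo" name are illustrative; the real path embeds
# epoch time, pid, and a random suffix.
base=/tmp/ansible-demo-tmp.$$
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-demo" \
  && echo ansible_tmp="$base/ansible-tmp-demo" )

# umask 77 strips group/other bits: 777 & ~077 = 700.
stat -c '%a' "$base/ansible-tmp-demo"
rm -rf "$base"
```

The subshell keeps the umask change from leaking into the rest of the remote command, and the final `echo name=path` is how Ansible reads the resulting path back from stdout.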
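Two details of the task result above are worth unpacking. First, `"stdout"` is empty while the whole `+`-prefixed trace sits in `"stderr"`: the script starts with `set -euxo pipefail` (exit on error, trace every command) and `exec 1>&2`, which redirects all of stdout into stderr for the rest of the script. Second, the `while ! ip addr show testbr | grep -q 'inet [1-9]'` loop is a bounded retry pattern, the workaround for the NetworkManager bug referenced in the script's comment. A root-free sketch of the same timer pattern, with a stub condition standing in for the `ip`/`grep` check (an assumption for the demo):

```shell
#!/bin/sh
set -eu
# Retry pattern from the test script: poll a condition up to 30 times,
# sleeping between attempts. The real script polls
#   ip addr show testbr | grep -q 'inet [1-9]'
# here a call counter stands in for it so the sketch runs without root.
tries=0
timer=0
check_ready() {
    # Stub: reports "ready" on the third call (demo assumption).
    tries=$((tries + 1))
    [ "$tries" -ge 3 ]
}
while ! check_ready; do
    timer=$((timer + 1))
    if [ "$timer" -eq 30 ]; then
        echo "ERROR - condition never came up" >&2
        exit 1
    fi
    sleep 0  # the real script sleeps 1s per attempt
done
echo "ready after $timer retries"   # prints: ready after 2 retries
```

In the captured run the address was present on the second check (`'[' 1 -eq 30 ']'` appears once in the trace), so the loop cost a single one-second sleep.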
12081 1726882389.25447: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882389.25450: _low_level_execute_command(): starting 12081 1726882389.25452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882387.7894676-12416-13275984700984/ > /dev/null 2>&1 && sleep 0' 12081 1726882389.26067: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882389.26080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.26098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.26120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.26162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.26175: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.26188: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.26209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.26223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.26232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.26242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.26254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.26270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.26280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.26289: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882389.26305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.26386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.26406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.26426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.26562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.28604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882389.28682: stderr chunk (state=3): >>><<< 12081 1726882389.28698: stdout chunk (state=3): >>><<< 12081 1726882389.28770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882389.28774: handler run complete 12081 1726882389.28776: Evaluated conditional (False): False 12081 1726882389.28778: attempt loop complete, returning result 12081 1726882389.28781: _execute() done 12081 1726882389.28786: dumping result to json 12081 1726882389.28972: done dumping result, returning 12081 1726882389.28975: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000115] 12081 1726882389.28977: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000115 12081 1726882389.29053: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000115 12081 1726882389.29056: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed 
false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.214340", "end": "2024-09-20 21:33:09.236792", "rc": 0, "start": "2024-09-20 21:33:08.022452" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12081 1726882389.29153: no more pending results, returning what we have 12081 1726882389.29156: results queue empty 12081 1726882389.29157: checking for any_errors_fatal 12081 1726882389.29171: done checking for any_errors_fatal 12081 1726882389.29172: checking for max_fail_percentage 12081 1726882389.29174: done checking for max_fail_percentage 12081 1726882389.29175: checking to see if all hosts have failed and 
the running result is not ok 12081 1726882389.29176: done checking to see if all hosts have failed 12081 1726882389.29177: getting the remaining hosts for this loop 12081 1726882389.29179: done getting the remaining hosts for this loop 12081 1726882389.29182: getting the next task for host managed_node3 12081 1726882389.29192: done getting next task for host managed_node3 12081 1726882389.29195: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12081 1726882389.29200: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882389.29204: getting variables 12081 1726882389.29206: in VariableManager get_vars() 12081 1726882389.29236: Calling all_inventory to load vars for managed_node3 12081 1726882389.29238: Calling groups_inventory to load vars for managed_node3 12081 1726882389.29242: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.29254: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.29257: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.29260: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.29439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.29817: done with get_vars() 12081 1726882389.29827: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:01.554) 0:00:09.102 ****** 12081 1726882389.29931: entering _queue_task() for managed_node3/include_tasks 12081 1726882389.30213: worker is 1 (out of 1 available) 12081 1726882389.30224: exiting _queue_task() for managed_node3/include_tasks 12081 1726882389.30240: done queuing things up, now waiting for results queue to drain 12081 1726882389.30241: waiting for pending results... 
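The "Create test interfaces" task that just completed relies on a bounded retry loop to work around the NetworkManager bug (rhbz#2079642): it keeps re-adding addresses to testbr until `ip addr show` reports an inet address, giving up after 30 attempts. A runnable sketch of that pattern, with the `ip` calls stubbed out (`check_addr` and `add_addr` are stand-ins, and the `sleep 1` is dropped so the sketch finishes instantly):

```shell
#!/bin/sh
# Bounded-retry pattern from the task's while loop. The stubs below
# pretend the address only "sticks" on the third add attempt.
attempts=0
check_addr() { [ "$attempts" -ge 3 ]; }       # stub for: ip addr show testbr | grep -q 'inet [1-9]'
add_addr()   { attempts=$((attempts + 1)); }  # stub for: ip addr add ... dev testbr

timer=0
while ! check_addr; do
    timer=$((timer + 1))
    if [ "$timer" -eq 30 ]; then
        echo "ERROR - could not add testbr" >&2
        exit 1
    fi
    add_addr || continue      # on failure, loop and try again
done
echo "address present after $timer tries"
```

The 30-iteration cap is what turns a NetworkManager race into a clean, diagnosable task failure instead of a hang.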
12081 1726882389.30492: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12081 1726882389.30618: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000011c 12081 1726882389.30634: variable 'ansible_search_path' from source: unknown 12081 1726882389.30642: variable 'ansible_search_path' from source: unknown 12081 1726882389.30687: calling self._execute() 12081 1726882389.30761: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.30777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.30793: variable 'omit' from source: magic vars 12081 1726882389.31142: variable 'ansible_distribution_major_version' from source: facts 12081 1726882389.31159: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882389.31172: _execute() done 12081 1726882389.31179: dumping result to json 12081 1726882389.31185: done dumping result, returning 12081 1726882389.31194: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-00000000011c] 12081 1726882389.31210: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000011c 12081 1726882389.31307: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000011c 12081 1726882389.31320: WORKER PROCESS EXITING 12081 1726882389.31347: no more pending results, returning what we have 12081 1726882389.31352: in VariableManager get_vars() 12081 1726882389.31387: Calling all_inventory to load vars for managed_node3 12081 1726882389.31391: Calling groups_inventory to load vars for managed_node3 12081 1726882389.31394: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.31409: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.31412: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.31415: Calling groups_plugins_play to load vars for managed_node3 12081 
1726882389.31651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.31834: done with get_vars() 12081 1726882389.31842: variable 'ansible_search_path' from source: unknown 12081 1726882389.31843: variable 'ansible_search_path' from source: unknown 12081 1726882389.31890: we have included files to process 12081 1726882389.31892: generating all_blocks data 12081 1726882389.31893: done generating all_blocks data 12081 1726882389.31902: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.31903: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.31906: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.33671: done processing included file 12081 1726882389.33673: iterating over new_blocks loaded from include file 12081 1726882389.33674: in VariableManager get_vars() 12081 1726882389.33690: done with get_vars() 12081 1726882389.33692: filtering new block on tags 12081 1726882389.33723: done filtering new block on tags 12081 1726882389.33726: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12081 1726882389.33733: extending task lists for all hosts with included blocks 12081 1726882389.33945: done extending task lists 12081 1726882389.33946: done processing included files 12081 1726882389.33947: results queue empty 12081 1726882389.33948: checking for any_errors_fatal 12081 1726882389.33954: done checking for any_errors_fatal 12081 1726882389.33955: checking for max_fail_percentage 12081 1726882389.33956: done checking for 
max_fail_percentage 12081 1726882389.33957: checking to see if all hosts have failed and the running result is not ok 12081 1726882389.33958: done checking to see if all hosts have failed 12081 1726882389.33959: getting the remaining hosts for this loop 12081 1726882389.33960: done getting the remaining hosts for this loop 12081 1726882389.34866: getting the next task for host managed_node3 12081 1726882389.34872: done getting next task for host managed_node3 12081 1726882389.34875: ^ task is: TASK: Get stat for interface {{ interface }} 12081 1726882389.34879: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882389.34882: getting variables 12081 1726882389.34882: in VariableManager get_vars() 12081 1726882389.34891: Calling all_inventory to load vars for managed_node3 12081 1726882389.34894: Calling groups_inventory to load vars for managed_node3 12081 1726882389.34896: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.34901: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.34904: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.34907: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.35055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.35252: done with get_vars() 12081 1726882389.35260: done getting variables 12081 1726882389.35408: variable 'interface' from source: task vars 12081 1726882389.35412: variable 'dhcp_interface1' from source: play vars 12081 1726882389.35479: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:00.055) 0:00:09.158 ****** 12081 1726882389.35511: entering _queue_task() for managed_node3/stat 12081 1726882389.35756: worker is 1 (out of 1 available) 12081 1726882389.36272: exiting _queue_task() for managed_node3/stat 12081 1726882389.36285: done queuing things up, now waiting for results queue to drain 12081 1726882389.36286: waiting for pending results... 
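Before the stat module can run, Ansible creates a private remote tmpdir (the `umask 77 && mkdir -p ... && mkdir ...` command appears further down in this log) so the transferred AnsiballZ payload is readable only by the connecting user. The pattern looks like this, using an illustrative path rather than the exact one from the run:

```shell
#!/bin/sh
# Mode-0700 working directory, the way _low_level_execute_command does it.
# $base is illustrative; Ansible uses ~/.ansible/tmp plus a unique suffix.
base="${TMPDIR:-/tmp}/demo-ansible-tmp"
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-$$" )
perms=$(ls -ld "$base/ansible-tmp-$$" | cut -c1-10)
echo "$perms"
rm -rf "$base"
```

Setting the umask inside a subshell keeps the 077 mask scoped to the two `mkdir` calls, so the caller's umask is untouched.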
12081 1726882389.36701: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 12081 1726882389.37036: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000017b 12081 1726882389.37058: variable 'ansible_search_path' from source: unknown 12081 1726882389.37069: variable 'ansible_search_path' from source: unknown 12081 1726882389.37112: calling self._execute() 12081 1726882389.37243: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.37360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.37376: variable 'omit' from source: magic vars 12081 1726882389.38249: variable 'ansible_distribution_major_version' from source: facts 12081 1726882389.38369: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882389.38381: variable 'omit' from source: magic vars 12081 1726882389.38534: variable 'omit' from source: magic vars 12081 1726882389.38785: variable 'interface' from source: task vars 12081 1726882389.38832: variable 'dhcp_interface1' from source: play vars 12081 1726882389.39047: variable 'dhcp_interface1' from source: play vars 12081 1726882389.39077: variable 'omit' from source: magic vars 12081 1726882389.39125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882389.39284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882389.39310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882389.39332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.39349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.39392: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882389.39483: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.39492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.39687: Set connection var ansible_pipelining to False 12081 1726882389.39717: Set connection var ansible_shell_type to sh 12081 1726882389.39730: Set connection var ansible_shell_executable to /bin/sh 12081 1726882389.39779: Set connection var ansible_connection to ssh 12081 1726882389.39791: Set connection var ansible_timeout to 10 12081 1726882389.39807: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882389.39836: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.39883: variable 'ansible_connection' from source: unknown 12081 1726882389.39891: variable 'ansible_module_compression' from source: unknown 12081 1726882389.39897: variable 'ansible_shell_type' from source: unknown 12081 1726882389.39904: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.39917: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.39941: variable 'ansible_pipelining' from source: unknown 12081 1726882389.39974: variable 'ansible_timeout' from source: unknown 12081 1726882389.39984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.41154: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882389.41694: variable 'omit' from source: magic vars 12081 1726882389.41704: starting attempt loop 12081 1726882389.41712: running the handler 12081 1726882389.41727: _low_level_execute_command(): starting 12081 1726882389.41738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882389.43397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.43402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.43427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.43431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.43433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.43616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.43620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.43622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.43740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.45399: stdout chunk (state=3): >>>/root <<< 12081 1726882389.45502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882389.45588: stderr chunk (state=3): >>><<< 12081 1726882389.45590: stdout chunk (state=3): >>><<< 12081 1726882389.45673: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882389.45676: _low_level_execute_command(): starting 12081 1726882389.45682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932 `" && echo ansible-tmp-1726882389.4560938-12477-241894565503932="` echo /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932 `" ) && sleep 0' 12081 1726882389.46515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882389.46536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.46556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.46578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.46632: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.46649: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.46672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.46693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.46705: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.46716: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.46729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.46743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.46764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.46782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.46794: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882389.46810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.46890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.46917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.46934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.47078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.49000: stdout chunk (state=3): >>>ansible-tmp-1726882389.4560938-12477-241894565503932=/root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932 <<< 12081 1726882389.49215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 
1726882389.49219: stdout chunk (state=3): >>><<< 12081 1726882389.49221: stderr chunk (state=3): >>><<< 12081 1726882389.49278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882389.4560938-12477-241894565503932=/root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882389.49472: variable 'ansible_module_compression' from source: unknown 12081 1726882389.49475: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882389.49477: variable 'ansible_facts' from source: unknown 12081 1726882389.49496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/AnsiballZ_stat.py 12081 1726882389.50196: Sending initial data 12081 1726882389.50206: Sent initial data (153 bytes) 12081 
1726882389.52584: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882389.52685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.52704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.52722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.52768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.52781: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.52800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.52818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.52830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.52842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.52855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.52872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.52890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.52905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.52920: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882389.52933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.53093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.53118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.53140: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.53279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.55133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882389.55230: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882389.55330: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmphy1nw8vm /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/AnsiballZ_stat.py <<< 12081 1726882389.55428: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882389.56987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882389.57170: stderr chunk (state=3): >>><<< 12081 1726882389.57173: stdout chunk (state=3): >>><<< 12081 1726882389.57176: done transferring module to remote 12081 1726882389.57178: _low_level_execute_command(): starting 12081 1726882389.57180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/ /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/AnsiballZ_stat.py && sleep 0' 12081 1726882389.58765: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 12081 1726882389.58884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.58903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.58921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.58962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.58976: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.58992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.59008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.59018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.59028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.59039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.59050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.59068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.59080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.59090: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882389.59105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.59181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.59282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.59296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
12081 1726882389.59445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.61287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882389.61290: stdout chunk (state=3): >>><<< 12081 1726882389.61297: stderr chunk (state=3): >>><<< 12081 1726882389.61394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882389.61397: _low_level_execute_command(): starting 12081 1726882389.61400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/AnsiballZ_stat.py && sleep 0' 12081 1726882389.62815: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.62818: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.62885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.62920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882389.62923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.62990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882389.62995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.63122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882389.63125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.63143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.63485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.76336: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26431, "dev": 21, "nlink": 1, "atime": 1726882388.0304806, "mtime": 1726882388.0304806, "ctime": 1726882388.0304806, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, 
"woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882389.77287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882389.77354: stderr chunk (state=3): >>><<< 12081 1726882389.77358: stdout chunk (state=3): >>><<< 12081 1726882389.77517: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26431, "dev": 21, "nlink": 1, "atime": 1726882388.0304806, "mtime": 1726882388.0304806, "ctime": 1726882388.0304806, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882389.77527: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882389.77530: _low_level_execute_command(): starting 12081 1726882389.77532: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882389.4560938-12477-241894565503932/ > /dev/null 2>&1 && sleep 0' 12081 1726882389.79045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882389.79060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.79075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.79092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.79133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.79143: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.79155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.79181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.79192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.79379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.79391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.79403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.79417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.79427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.79436: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882389.79448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.79527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 
1726882389.79548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882389.79566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882389.79698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882389.81655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882389.81658: stdout chunk (state=3): >>><<< 12081 1726882389.81660: stderr chunk (state=3): >>><<< 12081 1726882389.81872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882389.81875: handler run complete 12081 1726882389.81878: attempt loop complete, returning result 12081 1726882389.81880: _execute() done 12081 1726882389.81882: dumping result to json 12081 1726882389.81884: done dumping result, returning 12081 1726882389.81886: 
done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0e448fcc-3ce9-0a3f-ff3c-00000000017b] 12081 1726882389.81888: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000017b ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882388.0304806, "block_size": 4096, "blocks": 0, "ctime": 1726882388.0304806, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26431, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882388.0304806, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12081 1726882389.82053: no more pending results, returning what we have 12081 1726882389.82056: results queue empty 12081 1726882389.82057: checking for any_errors_fatal 12081 1726882389.82058: done checking for any_errors_fatal 12081 1726882389.82059: checking for max_fail_percentage 12081 1726882389.82060: done checking for max_fail_percentage 12081 1726882389.82061: checking to see if all hosts have failed and the running result is not ok 12081 1726882389.82062: done checking to see if all hosts have failed 12081 1726882389.82064: getting the remaining hosts for this loop 12081 1726882389.82066: done getting the remaining hosts for this loop 12081 1726882389.82070: getting the next task for host managed_node3 12081 1726882389.82079: done getting next task for host managed_node3 12081 1726882389.82081: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12081 1726882389.82085: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882389.82093: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000017b 12081 1726882389.82101: getting variables 12081 1726882389.82112: in VariableManager get_vars() 12081 1726882389.82139: Calling all_inventory to load vars for managed_node3 12081 1726882389.82141: Calling groups_inventory to load vars for managed_node3 12081 1726882389.82144: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.82158: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.82160: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.82166: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.82387: WORKER PROCESS EXITING 12081 1726882389.82411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.82623: done with get_vars() 12081 1726882389.82687: done getting variables 12081 1726882389.82902: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12081 1726882389.83162: variable 'interface' from source: task vars 12081 1726882389.83168: variable 'dhcp_interface1' from source: play vars 12081 1726882389.83260: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:09 -0400 (0:00:00.477) 0:00:09.635 ****** 12081 1726882389.83377: entering _queue_task() for managed_node3/assert 12081 1726882389.83379: Creating lock for assert 12081 1726882389.83924: worker is 1 (out of 1 available) 12081 1726882389.83936: 
exiting _queue_task() for managed_node3/assert 12081 1726882389.84066: done queuing things up, now waiting for results queue to drain 12081 1726882389.84068: waiting for pending results... 12081 1726882389.85111: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 12081 1726882389.85243: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000011d 12081 1726882389.85267: variable 'ansible_search_path' from source: unknown 12081 1726882389.85275: variable 'ansible_search_path' from source: unknown 12081 1726882389.85318: calling self._execute() 12081 1726882389.85391: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.85402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.85415: variable 'omit' from source: magic vars 12081 1726882389.85740: variable 'ansible_distribution_major_version' from source: facts 12081 1726882389.86085: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882389.86097: variable 'omit' from source: magic vars 12081 1726882389.86160: variable 'omit' from source: magic vars 12081 1726882389.86265: variable 'interface' from source: task vars 12081 1726882389.86879: variable 'dhcp_interface1' from source: play vars 12081 1726882389.86952: variable 'dhcp_interface1' from source: play vars 12081 1726882389.86979: variable 'omit' from source: magic vars 12081 1726882389.87025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882389.87066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882389.87093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882389.87114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.87128: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.87161: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882389.87170: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.87177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.87281: Set connection var ansible_pipelining to False 12081 1726882389.87289: Set connection var ansible_shell_type to sh 12081 1726882389.87300: Set connection var ansible_shell_executable to /bin/sh 12081 1726882389.87306: Set connection var ansible_connection to ssh 12081 1726882389.87314: Set connection var ansible_timeout to 10 12081 1726882389.87323: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882389.87350: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.87358: variable 'ansible_connection' from source: unknown 12081 1726882389.87366: variable 'ansible_module_compression' from source: unknown 12081 1726882389.87373: variable 'ansible_shell_type' from source: unknown 12081 1726882389.87378: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.87384: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.87391: variable 'ansible_pipelining' from source: unknown 12081 1726882389.87397: variable 'ansible_timeout' from source: unknown 12081 1726882389.87403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.87537: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882389.87681: variable 'omit' from source: magic vars 12081 1726882389.87691: 
starting attempt loop 12081 1726882389.87697: running the handler 12081 1726882389.87838: variable 'interface_stat' from source: set_fact 12081 1726882389.88291: Evaluated conditional (interface_stat.stat.exists): True 12081 1726882389.88301: handler run complete 12081 1726882389.88318: attempt loop complete, returning result 12081 1726882389.88324: _execute() done 12081 1726882389.88330: dumping result to json 12081 1726882389.88337: done dumping result, returning 12081 1726882389.88350: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-0a3f-ff3c-00000000011d] 12081 1726882389.88361: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000011d ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882389.88507: no more pending results, returning what we have 12081 1726882389.88509: results queue empty 12081 1726882389.88510: checking for any_errors_fatal 12081 1726882389.88517: done checking for any_errors_fatal 12081 1726882389.88517: checking for max_fail_percentage 12081 1726882389.88519: done checking for max_fail_percentage 12081 1726882389.88520: checking to see if all hosts have failed and the running result is not ok 12081 1726882389.88521: done checking to see if all hosts have failed 12081 1726882389.88522: getting the remaining hosts for this loop 12081 1726882389.88524: done getting the remaining hosts for this loop 12081 1726882389.88527: getting the next task for host managed_node3 12081 1726882389.88536: done getting next task for host managed_node3 12081 1726882389.88539: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12081 1726882389.88544: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882389.88547: getting variables 12081 1726882389.88548: in VariableManager get_vars() 12081 1726882389.88581: Calling all_inventory to load vars for managed_node3 12081 1726882389.88583: Calling groups_inventory to load vars for managed_node3 12081 1726882389.88586: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.88599: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.88601: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.88604: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.88769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.88938: done with get_vars() 12081 1726882389.88953: done getting variables 12081 1726882389.88991: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000011d 12081 1726882389.88994: WORKER PROCESS EXITING TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:00.057) 0:00:09.693 ****** 12081 1726882389.89080: entering _queue_task() for managed_node3/include_tasks 12081 1726882389.89346: worker is 1 (out of 1 available) 12081 1726882389.89360: exiting _queue_task() for managed_node3/include_tasks 12081 1726882389.89374: done queuing things up, now waiting for results queue to drain 12081 1726882389.89376: waiting for pending results... 12081 1726882389.90252: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12081 1726882389.90501: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000121 12081 1726882389.90587: variable 'ansible_search_path' from source: unknown 12081 1726882389.90624: variable 'ansible_search_path' from source: unknown 12081 1726882389.90688: calling self._execute() 12081 1726882389.91126: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.91138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.91179: variable 'omit' from source: magic vars 12081 1726882389.91926: variable 'ansible_distribution_major_version' from source: facts 12081 1726882389.92072: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882389.92084: _execute() done 12081 1726882389.92092: dumping result to json 12081 1726882389.92098: done dumping result, returning 12081 1726882389.92109: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000121] 12081 1726882389.92120: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000121 12081 1726882389.92231: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000121 12081 1726882389.92238: WORKER PROCESS EXITING 12081 1726882389.92279: no more pending results, returning what we have 12081 
1726882389.92284: in VariableManager get_vars() 12081 1726882389.92395: Calling all_inventory to load vars for managed_node3 12081 1726882389.92398: Calling groups_inventory to load vars for managed_node3 12081 1726882389.92401: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.92415: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.92417: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.92419: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.92583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.92771: done with get_vars() 12081 1726882389.92781: variable 'ansible_search_path' from source: unknown 12081 1726882389.92783: variable 'ansible_search_path' from source: unknown 12081 1726882389.92817: we have included files to process 12081 1726882389.92818: generating all_blocks data 12081 1726882389.92820: done generating all_blocks data 12081 1726882389.92825: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.92826: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.92828: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882389.93027: done processing included file 12081 1726882389.93029: iterating over new_blocks loaded from include file 12081 1726882389.93031: in VariableManager get_vars() 12081 1726882389.93046: done with get_vars() 12081 1726882389.93048: filtering new block on tags 12081 1726882389.93082: done filtering new block on tags 12081 1726882389.93085: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12081 1726882389.93090: extending task lists for all hosts with included blocks 12081 1726882389.93426: done extending task lists 12081 1726882389.93428: done processing included files 12081 1726882389.93429: results queue empty 12081 1726882389.93430: checking for any_errors_fatal 12081 1726882389.93433: done checking for any_errors_fatal 12081 1726882389.93433: checking for max_fail_percentage 12081 1726882389.93435: done checking for max_fail_percentage 12081 1726882389.93435: checking to see if all hosts have failed and the running result is not ok 12081 1726882389.93436: done checking to see if all hosts have failed 12081 1726882389.93437: getting the remaining hosts for this loop 12081 1726882389.93438: done getting the remaining hosts for this loop 12081 1726882389.93442: getting the next task for host managed_node3 12081 1726882389.93446: done getting next task for host managed_node3 12081 1726882389.93448: ^ task is: TASK: Get stat for interface {{ interface }} 12081 1726882389.93455: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882389.93458: getting variables 12081 1726882389.93459: in VariableManager get_vars() 12081 1726882389.93469: Calling all_inventory to load vars for managed_node3 12081 1726882389.93472: Calling groups_inventory to load vars for managed_node3 12081 1726882389.93474: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882389.93479: Calling all_plugins_play to load vars for managed_node3 12081 1726882389.93481: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882389.93484: Calling groups_plugins_play to load vars for managed_node3 12081 1726882389.93739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882389.94312: done with get_vars() 12081 1726882389.94320: done getting variables 12081 1726882389.94591: variable 'interface' from source: task vars 12081 1726882389.94595: variable 'dhcp_interface2' from source: play vars 12081 1726882389.94723: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:00.057) 
0:00:09.750 ****** 12081 1726882389.94790: entering _queue_task() for managed_node3/stat 12081 1726882389.95133: worker is 1 (out of 1 available) 12081 1726882389.95145: exiting _queue_task() for managed_node3/stat 12081 1726882389.95160: done queuing things up, now waiting for results queue to drain 12081 1726882389.95166: waiting for pending results... 12081 1726882389.95481: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 12081 1726882389.95676: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000019f 12081 1726882389.95705: variable 'ansible_search_path' from source: unknown 12081 1726882389.95726: variable 'ansible_search_path' from source: unknown 12081 1726882389.95784: calling self._execute() 12081 1726882389.95872: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.95884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.95898: variable 'omit' from source: magic vars 12081 1726882389.96304: variable 'ansible_distribution_major_version' from source: facts 12081 1726882389.96323: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882389.96340: variable 'omit' from source: magic vars 12081 1726882389.96427: variable 'omit' from source: magic vars 12081 1726882389.96559: variable 'interface' from source: task vars 12081 1726882389.96570: variable 'dhcp_interface2' from source: play vars 12081 1726882389.96780: variable 'dhcp_interface2' from source: play vars 12081 1726882389.96803: variable 'omit' from source: magic vars 12081 1726882389.96858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882389.96901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882389.96931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882389.96970: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.96993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882389.97040: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882389.97067: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.97083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.97212: Set connection var ansible_pipelining to False 12081 1726882389.97221: Set connection var ansible_shell_type to sh 12081 1726882389.97233: Set connection var ansible_shell_executable to /bin/sh 12081 1726882389.97288: Set connection var ansible_connection to ssh 12081 1726882389.97308: Set connection var ansible_timeout to 10 12081 1726882389.97320: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882389.97347: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.97386: variable 'ansible_connection' from source: unknown 12081 1726882389.97404: variable 'ansible_module_compression' from source: unknown 12081 1726882389.97412: variable 'ansible_shell_type' from source: unknown 12081 1726882389.97478: variable 'ansible_shell_executable' from source: unknown 12081 1726882389.97487: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882389.97497: variable 'ansible_pipelining' from source: unknown 12081 1726882389.97510: variable 'ansible_timeout' from source: unknown 12081 1726882389.97518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882389.98002: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882389.98017: variable 'omit' from source: magic vars 12081 1726882389.98027: starting attempt loop 12081 1726882389.98035: running the handler 12081 1726882389.98077: _low_level_execute_command(): starting 12081 1726882389.98130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882389.99648: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882389.99758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.99783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882389.99804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882389.99849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882389.99868: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882389.99889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882389.99909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882389.99921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882389.99933: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882389.99945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882389.99966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.00013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.00027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.00039: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.00057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.00257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.00278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.00293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.00509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.02155: stdout chunk (state=3): >>>/root <<< 12081 1726882390.02272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.02339: stderr chunk (state=3): >>><<< 12081 1726882390.02342: stdout chunk (state=3): >>><<< 12081 1726882390.02379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.02394: _low_level_execute_command(): starting 12081 1726882390.02401: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014 `" && echo ansible-tmp-1726882390.0237844-12498-25064947368014="` echo /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014 `" ) && sleep 0' 12081 1726882390.04645: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.04667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.04671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.04694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.04743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.04799: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.04812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.04836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.04839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.04855: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.04858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.04871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.04885: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.04893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.04905: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.04914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.04987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.05088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.05102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.05244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.07237: stdout chunk (state=3): >>>ansible-tmp-1726882390.0237844-12498-25064947368014=/root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014 <<< 12081 1726882390.07383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.07463: stderr chunk (state=3): >>><<< 12081 1726882390.07468: stdout chunk (state=3): >>><<< 12081 1726882390.07673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882390.0237844-12498-25064947368014=/root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.07676: variable 'ansible_module_compression' from source: unknown 12081 1726882390.07679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882390.07681: variable 'ansible_facts' from source: unknown 12081 1726882390.07754: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/AnsiballZ_stat.py 12081 1726882390.07954: Sending initial data 12081 1726882390.07957: Sent initial data (152 bytes) 12081 1726882390.09285: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.09302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.09319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.09341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.09396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.09409: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.09422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.09438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.09448: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.09472: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.09485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.09498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.09512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.09523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.09532: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.09545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.09629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.09651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.09673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.09853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.11553: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882390.11649: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server 
upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882390.11771: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpse3xyiz7 /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/AnsiballZ_stat.py <<< 12081 1726882390.11880: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882390.13270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.13390: stderr chunk (state=3): >>><<< 12081 1726882390.13393: stdout chunk (state=3): >>><<< 12081 1726882390.13409: done transferring module to remote 12081 1726882390.13420: _low_level_execute_command(): starting 12081 1726882390.13427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/ /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/AnsiballZ_stat.py && sleep 0' 12081 1726882390.14148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.14154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.14174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.14177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.14214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.14217: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.14225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.14245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.14248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 
1726882390.14254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.14256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.14267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.14280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.14287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.14294: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.14303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.14372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.14398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.14401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.14522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.16278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.16331: stderr chunk (state=3): >>><<< 12081 1726882390.16336: stdout chunk (state=3): >>><<< 12081 1726882390.16367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.16371: _low_level_execute_command(): starting 12081 1726882390.16373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/AnsiballZ_stat.py && sleep 0' 12081 1726882390.17034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.17038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.17040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.17042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.17291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.17298: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.17302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.17304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.17306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.17308: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.17310: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.17312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.17314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.17317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.17322: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.17324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.17326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.17327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.17329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.17411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.30401: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27017, "dev": 21, "nlink": 1, "atime": 1726882388.035317, "mtime": 1726882388.035317, "ctime": 1726882388.035317, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", 
"follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882390.31399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882390.31403: stdout chunk (state=3): >>><<< 12081 1726882390.31408: stderr chunk (state=3): >>><<< 12081 1726882390.31426: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27017, "dev": 21, "nlink": 1, "atime": 1726882388.035317, "mtime": 1726882388.035317, "ctime": 1726882388.035317, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882390.31489: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882390.31496: _low_level_execute_command(): starting 12081 1726882390.31501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882390.0237844-12498-25064947368014/ > /dev/null 2>&1 && sleep 0' 12081 1726882390.32136: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.32142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.32153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.32171: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.32210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.32218: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.32229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.32243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.32250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.32260: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.32274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.32283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.32294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.32301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.32307: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.32316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.32395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.32410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.32413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.32544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.34460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.34466: stdout chunk (state=3): >>><<< 12081 1726882390.34469: stderr chunk (state=3): >>><<< 
12081 1726882390.34879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.34882: handler run complete 12081 1726882390.34885: attempt loop complete, returning result 12081 1726882390.34888: _execute() done 12081 1726882390.34890: dumping result to json 12081 1726882390.34892: done dumping result, returning 12081 1726882390.34895: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0e448fcc-3ce9-0a3f-ff3c-00000000019f] 12081 1726882390.34897: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000019f 12081 1726882390.34979: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000019f 12081 1726882390.34983: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882388.035317, "block_size": 4096, "blocks": 0, "ctime": 1726882388.035317, "dev": 21, 
"device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27017, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882388.035317, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12081 1726882390.35105: no more pending results, returning what we have 12081 1726882390.35108: results queue empty 12081 1726882390.35108: checking for any_errors_fatal 12081 1726882390.35110: done checking for any_errors_fatal 12081 1726882390.35110: checking for max_fail_percentage 12081 1726882390.35112: done checking for max_fail_percentage 12081 1726882390.35112: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.35113: done checking to see if all hosts have failed 12081 1726882390.35114: getting the remaining hosts for this loop 12081 1726882390.35115: done getting the remaining hosts for this loop 12081 1726882390.35119: getting the next task for host managed_node3 12081 1726882390.35125: done getting next task for host managed_node3 12081 1726882390.35128: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12081 1726882390.35131: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882390.35139: getting variables 12081 1726882390.35140: in VariableManager get_vars() 12081 1726882390.35170: Calling all_inventory to load vars for managed_node3 12081 1726882390.35173: Calling groups_inventory to load vars for managed_node3 12081 1726882390.35176: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.35187: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.35189: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.35191: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.35371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.35590: done with get_vars() 12081 1726882390.35605: done getting variables 12081 1726882390.35666: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882390.35795: variable 'interface' from source: task vars 12081 
1726882390.35799: variable 'dhcp_interface2' from source: play vars 12081 1726882390.35868: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:10 -0400 (0:00:00.411) 0:00:10.161 ****** 12081 1726882390.35901: entering _queue_task() for managed_node3/assert 12081 1726882390.36197: worker is 1 (out of 1 available) 12081 1726882390.36209: exiting _queue_task() for managed_node3/assert 12081 1726882390.36220: done queuing things up, now waiting for results queue to drain 12081 1726882390.36221: waiting for pending results... 12081 1726882390.36497: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 12081 1726882390.36614: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000122 12081 1726882390.36626: variable 'ansible_search_path' from source: unknown 12081 1726882390.36630: variable 'ansible_search_path' from source: unknown 12081 1726882390.36673: calling self._execute() 12081 1726882390.36748: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.36752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.36766: variable 'omit' from source: magic vars 12081 1726882390.37194: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.37205: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.37211: variable 'omit' from source: magic vars 12081 1726882390.37283: variable 'omit' from source: magic vars 12081 1726882390.37382: variable 'interface' from source: task vars 12081 1726882390.37386: variable 'dhcp_interface2' from source: play vars 12081 1726882390.37451: variable 'dhcp_interface2' from source: play vars 12081 1726882390.37473: variable 'omit' from 
source: magic vars 12081 1726882390.37515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882390.37549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882390.37574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882390.37590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882390.37601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882390.37628: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882390.37631: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.37635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.37736: Set connection var ansible_pipelining to False 12081 1726882390.37739: Set connection var ansible_shell_type to sh 12081 1726882390.37746: Set connection var ansible_shell_executable to /bin/sh 12081 1726882390.37749: Set connection var ansible_connection to ssh 12081 1726882390.37761: Set connection var ansible_timeout to 10 12081 1726882390.37768: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882390.37792: variable 'ansible_shell_executable' from source: unknown 12081 1726882390.37795: variable 'ansible_connection' from source: unknown 12081 1726882390.37798: variable 'ansible_module_compression' from source: unknown 12081 1726882390.37800: variable 'ansible_shell_type' from source: unknown 12081 1726882390.37803: variable 'ansible_shell_executable' from source: unknown 12081 1726882390.37805: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.37807: variable 'ansible_pipelining' from source: unknown 12081 
1726882390.37810: variable 'ansible_timeout' from source: unknown 12081 1726882390.37815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.37947: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882390.37959: variable 'omit' from source: magic vars 12081 1726882390.37966: starting attempt loop 12081 1726882390.37972: running the handler 12081 1726882390.38107: variable 'interface_stat' from source: set_fact 12081 1726882390.38126: Evaluated conditional (interface_stat.stat.exists): True 12081 1726882390.38131: handler run complete 12081 1726882390.38146: attempt loop complete, returning result 12081 1726882390.38149: _execute() done 12081 1726882390.38151: dumping result to json 12081 1726882390.38156: done dumping result, returning 12081 1726882390.38166: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-0a3f-ff3c-000000000122] 12081 1726882390.38173: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000122 12081 1726882390.38258: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000122 12081 1726882390.38261: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882390.38337: no more pending results, returning what we have 12081 1726882390.38340: results queue empty 12081 1726882390.38341: checking for any_errors_fatal 12081 1726882390.38350: done checking for any_errors_fatal 12081 1726882390.38351: checking for max_fail_percentage 12081 1726882390.38353: done checking for max_fail_percentage 12081 1726882390.38354: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.38355: done 
checking to see if all hosts have failed 12081 1726882390.38356: getting the remaining hosts for this loop 12081 1726882390.38357: done getting the remaining hosts for this loop 12081 1726882390.38361: getting the next task for host managed_node3 12081 1726882390.38371: done getting next task for host managed_node3 12081 1726882390.38374: ^ task is: TASK: Test 12081 1726882390.38377: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882390.38381: getting variables 12081 1726882390.38382: in VariableManager get_vars() 12081 1726882390.38408: Calling all_inventory to load vars for managed_node3 12081 1726882390.38411: Calling groups_inventory to load vars for managed_node3 12081 1726882390.38415: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.38424: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.38427: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.38429: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.38643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.38840: done with get_vars() 12081 1726882390.38850: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 
September 2024 21:33:10 -0400 (0:00:00.030) 0:00:10.192 ****** 12081 1726882390.38937: entering _queue_task() for managed_node3/include_tasks 12081 1726882390.39197: worker is 1 (out of 1 available) 12081 1726882390.39210: exiting _queue_task() for managed_node3/include_tasks 12081 1726882390.39224: done queuing things up, now waiting for results queue to drain 12081 1726882390.39225: waiting for pending results... 12081 1726882390.39471: running TaskExecutor() for managed_node3/TASK: Test 12081 1726882390.39583: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008c 12081 1726882390.39603: variable 'ansible_search_path' from source: unknown 12081 1726882390.39611: variable 'ansible_search_path' from source: unknown 12081 1726882390.39665: variable 'lsr_test' from source: include params 12081 1726882390.39864: variable 'lsr_test' from source: include params 12081 1726882390.39938: variable 'omit' from source: magic vars 12081 1726882390.40062: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.40078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.40094: variable 'omit' from source: magic vars 12081 1726882390.40332: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.40348: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.40358: variable 'item' from source: unknown 12081 1726882390.40424: variable 'item' from source: unknown 12081 1726882390.40465: variable 'item' from source: unknown 12081 1726882390.40529: variable 'item' from source: unknown 12081 1726882390.40678: dumping result to json 12081 1726882390.40687: done dumping result, returning 12081 1726882390.40697: done running TaskExecutor() for managed_node3/TASK: Test [0e448fcc-3ce9-0a3f-ff3c-00000000008c] 12081 1726882390.40709: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008c 12081 1726882390.40797: no more pending results, returning what we 
have 12081 1726882390.40803: in VariableManager get_vars() 12081 1726882390.40838: Calling all_inventory to load vars for managed_node3 12081 1726882390.40841: Calling groups_inventory to load vars for managed_node3 12081 1726882390.40845: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.40862: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.40867: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.40871: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.41060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.41250: done with get_vars() 12081 1726882390.41258: variable 'ansible_search_path' from source: unknown 12081 1726882390.41259: variable 'ansible_search_path' from source: unknown 12081 1726882390.41276: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008c 12081 1726882390.41310: we have included files to process 12081 1726882390.41311: generating all_blocks data 12081 1726882390.41313: done generating all_blocks data 12081 1726882390.41318: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12081 1726882390.41320: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12081 1726882390.41323: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12081 1726882390.42236: WORKER PROCESS EXITING 12081 1726882390.42300: done processing included file 12081 1726882390.42302: iterating over new_blocks loaded from include file 12081 1726882390.42303: in VariableManager get_vars() 12081 1726882390.42317: done with get_vars() 12081 1726882390.42318: filtering new block on tags 12081 1726882390.42350: done 
filtering new block on tags 12081 1726882390.42352: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed_node3 => (item=tasks/create_bond_profile.yml) 12081 1726882390.42357: extending task lists for all hosts with included blocks 12081 1726882390.43407: done extending task lists 12081 1726882390.43408: done processing included files 12081 1726882390.43408: results queue empty 12081 1726882390.43409: checking for any_errors_fatal 12081 1726882390.43411: done checking for any_errors_fatal 12081 1726882390.43411: checking for max_fail_percentage 12081 1726882390.43412: done checking for max_fail_percentage 12081 1726882390.43413: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.43413: done checking to see if all hosts have failed 12081 1726882390.43414: getting the remaining hosts for this loop 12081 1726882390.43415: done getting the remaining hosts for this loop 12081 1726882390.43416: getting the next task for host managed_node3 12081 1726882390.43419: done getting next task for host managed_node3 12081 1726882390.43420: ^ task is: TASK: Include network role 12081 1726882390.43422: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882390.43424: getting variables 12081 1726882390.43425: in VariableManager get_vars() 12081 1726882390.43432: Calling all_inventory to load vars for managed_node3 12081 1726882390.43433: Calling groups_inventory to load vars for managed_node3 12081 1726882390.43435: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.43439: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.43440: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.43442: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.43528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.43643: done with get_vars() 12081 1726882390.43649: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Friday 20 September 2024 21:33:10 -0400 (0:00:00.047) 0:00:10.239 ****** 12081 1726882390.43701: entering _queue_task() for managed_node3/include_role 12081 1726882390.43702: Creating lock for include_role 12081 1726882390.43913: worker is 1 (out of 1 available) 12081 1726882390.43924: exiting _queue_task() for managed_node3/include_role 12081 1726882390.43936: done queuing things up, now waiting for results queue to drain 12081 1726882390.43937: waiting for pending results... 
12081 1726882390.44100: running TaskExecutor() for managed_node3/TASK: Include network role 12081 1726882390.44172: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000001c5 12081 1726882390.44189: variable 'ansible_search_path' from source: unknown 12081 1726882390.44192: variable 'ansible_search_path' from source: unknown 12081 1726882390.44218: calling self._execute() 12081 1726882390.44282: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.44287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.44296: variable 'omit' from source: magic vars 12081 1726882390.44549: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.44561: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.44568: _execute() done 12081 1726882390.44571: dumping result to json 12081 1726882390.44574: done dumping result, returning 12081 1726882390.44580: done running TaskExecutor() for managed_node3/TASK: Include network role [0e448fcc-3ce9-0a3f-ff3c-0000000001c5] 12081 1726882390.44587: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000001c5 12081 1726882390.44724: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000001c5 12081 1726882390.44728: WORKER PROCESS EXITING 12081 1726882390.44759: no more pending results, returning what we have 12081 1726882390.44766: in VariableManager get_vars() 12081 1726882390.44829: Calling all_inventory to load vars for managed_node3 12081 1726882390.44832: Calling groups_inventory to load vars for managed_node3 12081 1726882390.44836: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.44854: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.44857: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.44860: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.45061: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.45267: done with get_vars() 12081 1726882390.45279: variable 'ansible_search_path' from source: unknown 12081 1726882390.45280: variable 'ansible_search_path' from source: unknown 12081 1726882390.45473: variable 'omit' from source: magic vars 12081 1726882390.45518: variable 'omit' from source: magic vars 12081 1726882390.45533: variable 'omit' from source: magic vars 12081 1726882390.45537: we have included files to process 12081 1726882390.45538: generating all_blocks data 12081 1726882390.45539: done generating all_blocks data 12081 1726882390.45540: processing included file: fedora.linux_system_roles.network 12081 1726882390.45561: in VariableManager get_vars() 12081 1726882390.45573: done with get_vars() 12081 1726882390.45646: in VariableManager get_vars() 12081 1726882390.45667: done with get_vars() 12081 1726882390.45722: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12081 1726882390.45998: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12081 1726882390.46140: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12081 1726882390.46577: in VariableManager get_vars() 12081 1726882390.46591: done with get_vars() 12081 1726882390.46880: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882390.48009: iterating over new_blocks loaded from include file 12081 1726882390.48011: in VariableManager get_vars() 12081 1726882390.48024: done with get_vars() 12081 1726882390.48025: filtering new block on tags 12081 1726882390.48244: done filtering new block on tags 12081 1726882390.48247: in VariableManager get_vars() 12081 1726882390.48262: done with 
get_vars() 12081 1726882390.48266: filtering new block on tags 12081 1726882390.48283: done filtering new block on tags 12081 1726882390.48285: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 12081 1726882390.48290: extending task lists for all hosts with included blocks 12081 1726882390.48445: done extending task lists 12081 1726882390.48446: done processing included files 12081 1726882390.48447: results queue empty 12081 1726882390.48448: checking for any_errors_fatal 12081 1726882390.48451: done checking for any_errors_fatal 12081 1726882390.48452: checking for max_fail_percentage 12081 1726882390.48453: done checking for max_fail_percentage 12081 1726882390.48454: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.48455: done checking to see if all hosts have failed 12081 1726882390.48456: getting the remaining hosts for this loop 12081 1726882390.48458: done getting the remaining hosts for this loop 12081 1726882390.48461: getting the next task for host managed_node3 12081 1726882390.48467: done getting next task for host managed_node3 12081 1726882390.48470: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882390.48473: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882390.48483: getting variables 12081 1726882390.48484: in VariableManager get_vars() 12081 1726882390.48496: Calling all_inventory to load vars for managed_node3 12081 1726882390.48499: Calling groups_inventory to load vars for managed_node3 12081 1726882390.48501: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.48505: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.48507: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.48510: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.48668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.48857: done with get_vars() 12081 1726882390.48869: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:10 -0400 (0:00:00.052) 0:00:10.292 ****** 12081 1726882390.48939: entering _queue_task() for managed_node3/include_tasks 12081 1726882390.49580: worker is 1 (out of 1 available) 12081 1726882390.49591: exiting _queue_task() for managed_node3/include_tasks 12081 1726882390.49602: done queuing things up, now waiting for results queue to drain 12081 1726882390.49604: 
waiting for pending results... 12081 1726882390.49867: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882390.50015: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000277 12081 1726882390.50036: variable 'ansible_search_path' from source: unknown 12081 1726882390.50049: variable 'ansible_search_path' from source: unknown 12081 1726882390.50091: calling self._execute() 12081 1726882390.50180: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.50193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.50208: variable 'omit' from source: magic vars 12081 1726882390.50570: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.50592: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.50604: _execute() done 12081 1726882390.50612: dumping result to json 12081 1726882390.50620: done dumping result, returning 12081 1726882390.50631: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-0a3f-ff3c-000000000277] 12081 1726882390.50642: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000277 12081 1726882390.50804: no more pending results, returning what we have 12081 1726882390.50809: in VariableManager get_vars() 12081 1726882390.50854: Calling all_inventory to load vars for managed_node3 12081 1726882390.50857: Calling groups_inventory to load vars for managed_node3 12081 1726882390.50859: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.50875: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.50879: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.50882: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.51146: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.51411: done with get_vars() 12081 1726882390.51418: variable 'ansible_search_path' from source: unknown 12081 1726882390.51419: variable 'ansible_search_path' from source: unknown 12081 1726882390.51432: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000277 12081 1726882390.51435: WORKER PROCESS EXITING 12081 1726882390.51467: we have included files to process 12081 1726882390.51468: generating all_blocks data 12081 1726882390.51469: done generating all_blocks data 12081 1726882390.51473: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882390.51474: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882390.51476: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882390.51966: done processing included file 12081 1726882390.51967: iterating over new_blocks loaded from include file 12081 1726882390.51968: in VariableManager get_vars() 12081 1726882390.51983: done with get_vars() 12081 1726882390.51984: filtering new block on tags 12081 1726882390.52002: done filtering new block on tags 12081 1726882390.52004: in VariableManager get_vars() 12081 1726882390.52018: done with get_vars() 12081 1726882390.52019: filtering new block on tags 12081 1726882390.52046: done filtering new block on tags 12081 1726882390.52048: in VariableManager get_vars() 12081 1726882390.52061: done with get_vars() 12081 1726882390.52062: filtering new block on tags 12081 1726882390.52089: done filtering new block on tags 12081 1726882390.52091: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 
for managed_node3 12081 1726882390.52094: extending task lists for all hosts with included blocks 12081 1726882390.53360: done extending task lists 12081 1726882390.53362: done processing included files 12081 1726882390.53363: results queue empty 12081 1726882390.53365: checking for any_errors_fatal 12081 1726882390.53368: done checking for any_errors_fatal 12081 1726882390.53369: checking for max_fail_percentage 12081 1726882390.53370: done checking for max_fail_percentage 12081 1726882390.53371: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.53372: done checking to see if all hosts have failed 12081 1726882390.53372: getting the remaining hosts for this loop 12081 1726882390.53374: done getting the remaining hosts for this loop 12081 1726882390.53377: getting the next task for host managed_node3 12081 1726882390.53381: done getting next task for host managed_node3 12081 1726882390.53384: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882390.53388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882390.53396: getting variables 12081 1726882390.53397: in VariableManager get_vars() 12081 1726882390.53412: Calling all_inventory to load vars for managed_node3 12081 1726882390.53414: Calling groups_inventory to load vars for managed_node3 12081 1726882390.53416: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.53421: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.53424: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.53426: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.53596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.53806: done with get_vars() 12081 1726882390.53815: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:10 -0400 (0:00:00.049) 0:00:10.341 ****** 12081 1726882390.53885: entering _queue_task() for managed_node3/setup 12081 1726882390.55352: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882390.55403: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000002d4 12081 1726882390.55406: variable 'ansible_search_path' from source: unknown 12081 
1726882390.55410: variable 'ansible_search_path' from source: unknown 12081 1726882390.55413: calling self._execute() 12081 1726882390.55415: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.55417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.55420: variable 'omit' from source: magic vars 12081 1726882390.55394: worker is 1 (out of 1 available) 12081 1726882390.55432: exiting _queue_task() for managed_node3/setup 12081 1726882390.55441: done queuing things up, now waiting for results queue to drain 12081 1726882390.55442: waiting for pending results... 12081 1726882390.56119: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.56136: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.56346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882390.58816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882390.58880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882390.58916: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882390.58950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882390.58981: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882390.59303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882390.59338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882390.59371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882390.59419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882390.59438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882390.59496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882390.59525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882390.59553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882390.59600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882390.59620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882390.59781: variable '__network_required_facts' from source: role '' 
defaults 12081 1726882390.60375: variable 'ansible_facts' from source: unknown 12081 1726882390.60461: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12081 1726882390.60474: when evaluation is False, skipping this task 12081 1726882390.60481: _execute() done 12081 1726882390.60488: dumping result to json 12081 1726882390.60495: done dumping result, returning 12081 1726882390.60507: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-0a3f-ff3c-0000000002d4] 12081 1726882390.60518: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d4 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882390.60669: no more pending results, returning what we have 12081 1726882390.60673: results queue empty 12081 1726882390.60675: checking for any_errors_fatal 12081 1726882390.60676: done checking for any_errors_fatal 12081 1726882390.60677: checking for max_fail_percentage 12081 1726882390.60678: done checking for max_fail_percentage 12081 1726882390.60679: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.60680: done checking to see if all hosts have failed 12081 1726882390.60681: getting the remaining hosts for this loop 12081 1726882390.60683: done getting the remaining hosts for this loop 12081 1726882390.60686: getting the next task for host managed_node3 12081 1726882390.60697: done getting next task for host managed_node3 12081 1726882390.60700: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882390.60705: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882390.60722: getting variables 12081 1726882390.60725: in VariableManager get_vars() 12081 1726882390.60764: Calling all_inventory to load vars for managed_node3 12081 1726882390.60767: Calling groups_inventory to load vars for managed_node3 12081 1726882390.60769: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.60781: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.60783: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.60787: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.61028: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d4 12081 1726882390.61049: WORKER PROCESS EXITING 12081 1726882390.61067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.61371: done with get_vars() 12081 1726882390.61389: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:10 -0400 (0:00:00.076) 0:00:10.418 ****** 12081 1726882390.61508: entering _queue_task() for managed_node3/stat 12081 1726882390.61799: worker is 1 (out of 1 available) 12081 1726882390.61818: exiting _queue_task() for managed_node3/stat 12081 1726882390.61830: done queuing things up, now waiting for results queue to drain 12081 1726882390.61831: waiting for pending results... 
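The skip recorded just above comes from the role's fact gate: the task runs only when `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluates true, and here every required fact was already gathered. A minimal Python sketch of that set-difference check (the required-fact names below are illustrative, not taken from the role's actual defaults):

```python
# Sketch of the Jinja2 guard seen in the log:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# expressed with plain Python set logic. The fact list is a stand-in;
# the real list comes from the role defaults (`__network_required_facts`).
required_facts = ["distribution", "distribution_major_version", "os_family"]

def setup_needed(required, ansible_facts):
    """Return True when at least one required fact is missing,
    i.e. the setup task should run instead of being skipped."""
    missing = set(required) - set(ansible_facts.keys())
    return len(missing) > 0

# All facts present -> guard is False -> task skips, as in the log above.
facts = {"distribution": "Fedora",
         "distribution_major_version": "40",
         "os_family": "RedHat"}
print(setup_needed(required_facts, facts))                  # False

# A missing fact flips the guard, and setup would run.
print(setup_needed(required_facts, {"os_family": "RedHat"}))  # True
```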
12081 1726882390.62140: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882390.62301: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000002d6 12081 1726882390.62314: variable 'ansible_search_path' from source: unknown 12081 1726882390.62317: variable 'ansible_search_path' from source: unknown 12081 1726882390.62360: calling self._execute() 12081 1726882390.62450: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.62457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.62473: variable 'omit' from source: magic vars 12081 1726882390.62875: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.62888: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.63089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882390.63419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882390.63477: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882390.63519: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882390.63552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882390.63667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882390.63705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882390.63732: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882390.63763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882390.63874: variable '__network_is_ostree' from source: set_fact 12081 1726882390.63881: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882390.63884: when evaluation is False, skipping this task 12081 1726882390.63891: _execute() done 12081 1726882390.63894: dumping result to json 12081 1726882390.63902: done dumping result, returning 12081 1726882390.63911: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-0a3f-ff3c-0000000002d6] 12081 1726882390.63924: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d6 12081 1726882390.64017: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d6 12081 1726882390.64020: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882390.64083: no more pending results, returning what we have 12081 1726882390.64088: results queue empty 12081 1726882390.64089: checking for any_errors_fatal 12081 1726882390.64097: done checking for any_errors_fatal 12081 1726882390.64098: checking for max_fail_percentage 12081 1726882390.64101: done checking for max_fail_percentage 12081 1726882390.64102: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.64104: done checking to see if all hosts have failed 12081 1726882390.64105: getting the remaining hosts for this loop 12081 1726882390.64107: done getting the remaining hosts for this loop 12081 
1726882390.64113: getting the next task for host managed_node3 12081 1726882390.64123: done getting next task for host managed_node3 12081 1726882390.64127: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882390.64134: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882390.64153: getting variables 12081 1726882390.64155: in VariableManager get_vars() 12081 1726882390.64197: Calling all_inventory to load vars for managed_node3 12081 1726882390.64201: Calling groups_inventory to load vars for managed_node3 12081 1726882390.64203: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.64216: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.64219: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.64222: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.64421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.64630: done with get_vars() 12081 1726882390.64642: done getting variables 12081 1726882390.64829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:10 -0400 (0:00:00.034) 0:00:10.452 ****** 12081 1726882390.64925: entering _queue_task() for managed_node3/set_fact 12081 1726882390.65277: worker is 1 (out of 1 available) 12081 1726882390.65290: exiting _queue_task() for managed_node3/set_fact 12081 1726882390.65303: done queuing things up, now waiting for results queue to drain 12081 1726882390.65304: waiting for pending results... 
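Both ostree tasks above (the `stat` probe and the follow-up `set_fact`) are guarded by `not __network_is_ostree is defined`, so once an earlier pass of the role has set the fact, every later pass skips them. A hedged Python simulation of that "probe once, then skip" pattern (the dictionary cache stands in for Ansible's per-host fact store; the real probe stats a marker file):

```python
# Simulation of the guard `when: not __network_is_ostree is defined`:
# the first call runs the probe and caches the result via set_fact,
# after which the conditional is False and the task skips, as logged.
host_facts = {}  # stand-in for Ansible's per-host fact cache

def check_ostree(facts, probe=lambda: False):
    """Run the probe only when the flag is not yet defined."""
    if "__network_is_ostree" not in facts:       # not ... is defined
        facts["__network_is_ostree"] = probe()   # set_fact
        return "ran"
    return "skipped"

print(check_ostree(host_facts))  # first pass: ran
print(check_ostree(host_facts))  # later passes: skipped
```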
12081 1726882390.65600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882390.65729: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000002d7 12081 1726882390.65742: variable 'ansible_search_path' from source: unknown 12081 1726882390.65747: variable 'ansible_search_path' from source: unknown 12081 1726882390.65798: calling self._execute() 12081 1726882390.65881: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.65886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.65902: variable 'omit' from source: magic vars 12081 1726882390.66260: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.66275: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.66745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882390.67027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882390.67081: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882390.67123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882390.67157: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882390.67253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882390.67288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882390.67313: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882390.67345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882390.67442: variable '__network_is_ostree' from source: set_fact 12081 1726882390.67450: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882390.67452: when evaluation is False, skipping this task 12081 1726882390.67458: _execute() done 12081 1726882390.67461: dumping result to json 12081 1726882390.67466: done dumping result, returning 12081 1726882390.67476: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-0a3f-ff3c-0000000002d7] 12081 1726882390.67482: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d7 12081 1726882390.67577: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d7 12081 1726882390.67580: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882390.67639: no more pending results, returning what we have 12081 1726882390.67643: results queue empty 12081 1726882390.67644: checking for any_errors_fatal 12081 1726882390.67648: done checking for any_errors_fatal 12081 1726882390.67649: checking for max_fail_percentage 12081 1726882390.67653: done checking for max_fail_percentage 12081 1726882390.67654: checking to see if all hosts have failed and the running result is not ok 12081 1726882390.67656: done checking to see if all hosts have failed 12081 1726882390.67656: getting the remaining hosts for this loop 12081 1726882390.67658: done getting the remaining hosts for this loop 
12081 1726882390.67663: getting the next task for host managed_node3 12081 1726882390.67676: done getting next task for host managed_node3 12081 1726882390.67680: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882390.67686: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882390.67701: getting variables 12081 1726882390.67703: in VariableManager get_vars() 12081 1726882390.67745: Calling all_inventory to load vars for managed_node3 12081 1726882390.67748: Calling groups_inventory to load vars for managed_node3 12081 1726882390.67754: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882390.67768: Calling all_plugins_play to load vars for managed_node3 12081 1726882390.67772: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882390.67776: Calling groups_plugins_play to load vars for managed_node3 12081 1726882390.68204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882390.68680: done with get_vars() 12081 1726882390.68692: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:10 -0400 (0:00:00.038) 0:00:10.491 ****** 12081 1726882390.68808: entering _queue_task() for managed_node3/service_facts 12081 1726882390.68810: Creating lock for service_facts 12081 1726882390.69099: worker is 1 (out of 1 available) 12081 1726882390.69111: exiting _queue_task() for managed_node3/service_facts 12081 1726882390.69122: done queuing things up, now waiting for results queue to drain 12081 1726882390.69123: waiting for pending results... 
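After assembling the connection vars, the worker issues its first low-level command, `/bin/sh -c 'echo ~ && sleep 0'`: the shell expands `~` on the remote side, which is how Ansible discovers the login user's home directory before staging the module payload. The sketch below runs the same shell snippet locally with `subprocess` instead of over SSH, purely to illustrate the probe:

```python
import subprocess

# The first _low_level_execute_command() in the log runs:
#     /bin/sh -c 'echo ~ && sleep 0'
# Run the identical snippet locally (no SSH) to see what it returns.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
home = result.stdout.strip()
print(home)  # the invoking user's home directory, e.g. /root for root
```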
12081 1726882390.69399: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882390.69545: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000002d9 12081 1726882390.69561: variable 'ansible_search_path' from source: unknown 12081 1726882390.69567: variable 'ansible_search_path' from source: unknown 12081 1726882390.69609: calling self._execute() 12081 1726882390.69689: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.69695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.69704: variable 'omit' from source: magic vars 12081 1726882390.70106: variable 'ansible_distribution_major_version' from source: facts 12081 1726882390.70123: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882390.70129: variable 'omit' from source: magic vars 12081 1726882390.70243: variable 'omit' from source: magic vars 12081 1726882390.70293: variable 'omit' from source: magic vars 12081 1726882390.70338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882390.70395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882390.70420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882390.70436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882390.70453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882390.70503: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882390.70507: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.70509: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882390.70668: Set connection var ansible_pipelining to False 12081 1726882390.70671: Set connection var ansible_shell_type to sh 12081 1726882390.70679: Set connection var ansible_shell_executable to /bin/sh 12081 1726882390.70682: Set connection var ansible_connection to ssh 12081 1726882390.70692: Set connection var ansible_timeout to 10 12081 1726882390.70701: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882390.70733: variable 'ansible_shell_executable' from source: unknown 12081 1726882390.70737: variable 'ansible_connection' from source: unknown 12081 1726882390.70740: variable 'ansible_module_compression' from source: unknown 12081 1726882390.70743: variable 'ansible_shell_type' from source: unknown 12081 1726882390.70745: variable 'ansible_shell_executable' from source: unknown 12081 1726882390.70748: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882390.70750: variable 'ansible_pipelining' from source: unknown 12081 1726882390.70753: variable 'ansible_timeout' from source: unknown 12081 1726882390.70758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882390.71014: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882390.71032: variable 'omit' from source: magic vars 12081 1726882390.71037: starting attempt loop 12081 1726882390.71040: running the handler 12081 1726882390.71055: _low_level_execute_command(): starting 12081 1726882390.71066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882390.72030: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882390.72043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882390.72057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.72077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.72146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.72165: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.72175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.72190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.72198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.72207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.72215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.72263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.72278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.72287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.72714: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.72719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.72792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.72806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.72812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882390.72948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882390.74689: stdout chunk (state=3): >>>/root <<< 12081 1726882390.74785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.74878: stderr chunk (state=3): >>><<< 12081 1726882390.74890: stdout chunk (state=3): >>><<< 12081 1726882390.74975: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.74978: _low_level_execute_command(): starting 12081 1726882390.74980: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363 `" && echo ansible-tmp-1726882390.7491915-12542-6791407983363="` echo /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363 `" ) && sleep 0' 12081 1726882390.75587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 12081 1726882390.75604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.75620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.75646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.75691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.75702: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882390.75715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.75731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882390.75748: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882390.75758: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882390.75772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882390.75784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882390.75798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882390.75808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882390.75817: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882390.75828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882390.75902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882390.75924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882390.75940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
12081 1726882390.76088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882390.77997: stdout chunk (state=3): >>>ansible-tmp-1726882390.7491915-12542-6791407983363=/root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363 <<< 12081 1726882390.78095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882390.78190: stderr chunk (state=3): >>><<< 12081 1726882390.78200: stdout chunk (state=3): >>><<< 12081 1726882390.78373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882390.7491915-12542-6791407983363=/root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882390.78376: variable 'ansible_module_compression' from source: unknown 12081 1726882390.78379: ANSIBALLZ: Using lock for service_facts 12081 
1726882390.78381: ANSIBALLZ: Acquiring lock 12081 1726882390.78383: ANSIBALLZ: Lock acquired: 139893494149200 12081 1726882390.78385: ANSIBALLZ: Creating module 12081 1726882390.99725: ANSIBALLZ: Writing module into payload 12081 1726882390.99848: ANSIBALLZ: Writing module 12081 1726882390.99878: ANSIBALLZ: Renaming module 12081 1726882390.99884: ANSIBALLZ: Done creating module 12081 1726882390.99900: variable 'ansible_facts' from source: unknown 12081 1726882390.99976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/AnsiballZ_service_facts.py 12081 1726882391.00128: Sending initial data 12081 1726882391.00133: Sent initial data (160 bytes) 12081 1726882391.01120: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882391.01131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882391.01141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882391.01156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.01194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882391.01201: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882391.01211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.01224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882391.01233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882391.01242: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882391.01250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882391.01262: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882391.01275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.01283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882391.01290: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882391.01298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.01372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882391.01387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882391.01390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882391.01590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882391.03469: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882391.03567: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882391.03678: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp2lgdfge7 /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/AnsiballZ_service_facts.py <<< 12081 1726882391.03773: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 12081 1726882391.05346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882391.05435: stderr chunk (state=3): >>><<< 12081 1726882391.05439: stdout chunk (state=3): >>><<< 12081 1726882391.05461: done transferring module to remote 12081 1726882391.05474: _low_level_execute_command(): starting 12081 1726882391.05480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/ /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/AnsiballZ_service_facts.py && sleep 0' 12081 1726882391.07232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882391.07237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.07275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882391.07288: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882391.07297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.07311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882391.07318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882391.07324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882391.07333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882391.07342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882391.07356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.07362: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882391.07371: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882391.07380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.07454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882391.07472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882391.07484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882391.07614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882391.09474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882391.09489: stdout chunk (state=3): >>><<< 12081 1726882391.09492: stderr chunk (state=3): >>><<< 12081 1726882391.09499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882391.09501: _low_level_execute_command(): starting 12081 1726882391.09507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/AnsiballZ_service_facts.py && sleep 0' 12081 1726882391.11147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882391.11987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882391.11998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882391.12013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.12056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882391.12062: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882391.12075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.12090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882391.12098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882391.12105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882391.12113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882391.12122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882391.12136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882391.12143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 
1726882391.12148: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882391.12157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882391.12233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882391.12256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882391.12266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882391.12408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882392.46362: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 12081 1726882392.46388: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 12081 1726882392.46403: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", 
"state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.s<<< 12081 1726882392.46420: stdout chunk (state=3): >>>ervice", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12081 1726882392.47754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882392.47758: stdout chunk (state=3): >>><<< 12081 1726882392.47766: stderr chunk (state=3): >>><<< 12081 1726882392.47791: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": 
"systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
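The `service_facts` payload above is a flat map of unit name to `{state, status, source}`. Template units (names containing `@`) report `state: unknown`, while the `status` field carries the enablement. A minimal sketch of consuming this structure, using a few entries copied from the result above (the filtering logic and variable names are illustrative, not Ansible code):

```python
# A few entries copied verbatim from the service_facts result in the log;
# the filter below is an illustrative sketch, not part of the playbook.
services = {
    "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"},
    "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"},
    "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"},
    "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"},
}

# Select units by enablement rather than runtime state.
disabled = sorted(n for n, s in services.items() if s["status"] == "disabled")
print(disabled)  # ['rdisc.service', 'systemd-pstore.service']
```

In a playbook this same map is exposed as `ansible_facts.services`, so the equivalent filter is typically written as a Jinja2 expression over that dict.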
12081 1726882392.48441: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882392.48454: _low_level_execute_command(): starting 12081 1726882392.48457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882390.7491915-12542-6791407983363/ > /dev/null 2>&1 && sleep 0' 12081 1726882392.50315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882392.50319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882392.50361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882392.50368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.50459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882392.50469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882392.50485: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882392.50490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.50568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882392.50573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882392.50680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882392.50891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882392.52670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882392.52699: stderr chunk (state=3): >>><<< 12081 1726882392.52702: stdout chunk (state=3): >>><<< 12081 1726882392.53071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12081 1726882392.53074: handler run complete
12081 1726882392.53077: variable 'ansible_facts' from source: unknown
12081 1726882392.53079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882392.53555: variable 'ansible_facts' from source: unknown
12081 1726882392.53675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882392.53885: attempt loop complete, returning result
12081 1726882392.53897: _execute() done
12081 1726882392.53907: dumping result to json
12081 1726882392.53973: done dumping result, returning
12081 1726882392.53988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-0a3f-ff3c-0000000002d9]
12081 1726882392.53999: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d9
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12081 1726882392.55223: no more pending results, returning what we have
12081 1726882392.55226: results queue empty
12081 1726882392.55227: checking for any_errors_fatal
12081 1726882392.55232: done checking for any_errors_fatal
12081 1726882392.55233: checking for max_fail_percentage
12081 1726882392.55234: done checking for max_fail_percentage
12081 1726882392.55236: checking to see if all hosts have failed and the running result is not ok
12081 1726882392.55237: done checking to see if all hosts have failed
12081 1726882392.55238: getting the remaining hosts for this loop
12081 1726882392.55240: done getting the remaining hosts for this loop
12081 1726882392.55244: getting the next task for host managed_node3
12081 1726882392.55254: done
getting next task for host managed_node3
12081 1726882392.55259: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed
12081 1726882392.55267: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12081 1726882392.55276: getting variables
12081 1726882392.55278: in VariableManager get_vars()
12081 1726882392.55311: Calling all_inventory to load vars for managed_node3
12081 1726882392.55314: Calling groups_inventory to load vars for managed_node3
12081 1726882392.55317: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882392.55329: Calling all_plugins_play to load vars for managed_node3
12081 1726882392.55331: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882392.55339: Calling groups_plugins_play to load vars for managed_node3
12081 1726882392.55726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882392.56477: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002d9
12081 1726882392.56481: WORKER PROCESS EXITING
12081 1726882392.56948: done with get_vars()
12081 1726882392.56969: done getting variables

TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Friday 20 September 2024 21:33:12 -0400 (0:00:01.882) 0:00:12.373 ******
12081 1726882392.57091: entering _queue_task() for managed_node3/package_facts
12081 1726882392.57093: Creating lock for package_facts
12081 1726882392.57510: worker is 1 (out of 1 available)
12081 1726882392.57521: exiting _queue_task() for managed_node3/package_facts
12081 1726882392.57536: done queuing things up, now waiting for results queue to drain
12081 1726882392.57537: waiting for pending results...
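The worker that picks up this `package_facts` task repeats the low-level command pattern visible throughout this log: create a remote temp directory under a restrictive umask, transfer the AnsiballZ payload over sftp, chmod it, run it with the remote Python, then remove the directory (the `rm -f -r .../ > /dev/null 2>&1` near the top of this section is that cleanup half). A local re-enactment of the temp-dir step, assuming illustrative paths rather than Ansible's actual target-side naming:

```python
# Sketch of the `( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-..." )`
# command Ansible issues over SSH in this log. Paths/names here are
# illustrative; Ansible composes them on the managed node.
import os
import random
import tempfile
import time

base = os.path.join(tempfile.mkdtemp(), ".ansible", "tmp")
old_umask = os.umask(0o77)  # matches `umask 77`: new dirs come out mode 0700
try:
    os.makedirs(base, exist_ok=True)
    tmpdir = os.path.join(
        base, f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**14)}"
    )
    os.mkdir(tmpdir)  # fails if the name already exists, like plain `mkdir`
finally:
    os.umask(old_umask)

print(tmpdir, oct(os.stat(tmpdir).st_mode & 0o777))
```

The restrictive umask is the point of the dance: module payloads and their temp files land on the target readable only by the connecting user.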
12081 1726882392.57969: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882392.58139: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000002da 12081 1726882392.58168: variable 'ansible_search_path' from source: unknown 12081 1726882392.58177: variable 'ansible_search_path' from source: unknown 12081 1726882392.58220: calling self._execute() 12081 1726882392.58318: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882392.58328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882392.58346: variable 'omit' from source: magic vars 12081 1726882392.58730: variable 'ansible_distribution_major_version' from source: facts 12081 1726882392.58748: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882392.58765: variable 'omit' from source: magic vars 12081 1726882392.58865: variable 'omit' from source: magic vars 12081 1726882392.59032: variable 'omit' from source: magic vars 12081 1726882392.59079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882392.59154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882392.59240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882392.59346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882392.59368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882392.59400: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882392.59407: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882392.59414: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882392.59636: Set connection var ansible_pipelining to False 12081 1726882392.59645: Set connection var ansible_shell_type to sh 12081 1726882392.59671: Set connection var ansible_shell_executable to /bin/sh 12081 1726882392.59766: Set connection var ansible_connection to ssh 12081 1726882392.59780: Set connection var ansible_timeout to 10 12081 1726882392.59789: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882392.59815: variable 'ansible_shell_executable' from source: unknown 12081 1726882392.59822: variable 'ansible_connection' from source: unknown 12081 1726882392.59828: variable 'ansible_module_compression' from source: unknown 12081 1726882392.59835: variable 'ansible_shell_type' from source: unknown 12081 1726882392.59841: variable 'ansible_shell_executable' from source: unknown 12081 1726882392.59848: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882392.59859: variable 'ansible_pipelining' from source: unknown 12081 1726882392.59870: variable 'ansible_timeout' from source: unknown 12081 1726882392.59882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882392.60298: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882392.60369: variable 'omit' from source: magic vars 12081 1726882392.60380: starting attempt loop 12081 1726882392.60387: running the handler 12081 1726882392.60406: _low_level_execute_command(): starting 12081 1726882392.60436: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882392.61242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882392.61259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882392.61276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882392.61297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882392.61343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882392.61358: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882392.61373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.61392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882392.61405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882392.61415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882392.61430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882392.61444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882392.61465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882392.61478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882392.61489: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882392.61503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.61588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882392.61604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882392.61621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882392.61748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882392.63359: stdout chunk (state=3): >>>/root <<< 12081 1726882392.63480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882392.63543: stderr chunk (state=3): >>><<< 12081 1726882392.63547: stdout chunk (state=3): >>><<< 12081 1726882392.63575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882392.63588: _low_level_execute_command(): starting 12081 1726882392.63595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770 `" && echo ansible-tmp-1726882392.6357443-12623-214896088252770="` echo /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770 `" ) && sleep 0' 12081 1726882392.64414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882392.64419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882392.64472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.64478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882392.64492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882392.64495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882392.64508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882392.64513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882392.64595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882392.64609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882392.64614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882392.64745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882392.66604: stdout chunk (state=3): >>>ansible-tmp-1726882392.6357443-12623-214896088252770=/root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770 <<< 12081 1726882392.66809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882392.66812: stdout chunk 
(state=3): >>><<< 12081 1726882392.66815: stderr chunk (state=3): >>><<< 12081 1726882392.67074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882392.6357443-12623-214896088252770=/root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882392.67077: variable 'ansible_module_compression' from source: unknown 12081 1726882392.67080: ANSIBALLZ: Using lock for package_facts 12081 1726882392.67082: ANSIBALLZ: Acquiring lock 12081 1726882392.67084: ANSIBALLZ: Lock acquired: 139893494140512 12081 1726882392.67085: ANSIBALLZ: Creating module 12081 1726882393.03727: ANSIBALLZ: Writing module into payload 12081 1726882393.04166: ANSIBALLZ: Writing module 12081 1726882393.04196: ANSIBALLZ: Renaming module 12081 1726882393.04199: ANSIBALLZ: Done creating module 12081 1726882393.04234: variable 'ansible_facts' from 
source: unknown 12081 1726882393.04727: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/AnsiballZ_package_facts.py 12081 1726882393.04992: Sending initial data 12081 1726882393.05002: Sent initial data (162 bytes) 12081 1726882393.05719: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.05726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882393.05760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.05767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.05786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882393.05790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.05840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882393.05843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882393.05861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882393.05983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882393.07883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882393.07978: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882393.08080: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpmvxgh2qj /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/AnsiballZ_package_facts.py <<< 12081 1726882393.08177: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882393.10543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882393.10653: stderr chunk (state=3): >>><<< 12081 1726882393.10658: stdout chunk (state=3): >>><<< 12081 1726882393.10721: done transferring module to remote 12081 1726882393.10726: _low_level_execute_command(): starting 12081 1726882393.10733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/ /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/AnsiballZ_package_facts.py && sleep 0' 12081 1726882393.11210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.11218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882393.11253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882393.11265: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882393.11276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.11282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882393.11287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882393.11294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882393.11299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.11308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882393.11316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882393.11320: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882393.11325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.11384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882393.11405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882393.11408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882393.11512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882393.13605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882393.13609: stdout chunk (state=3): >>><<< 12081 1726882393.13612: stderr chunk (state=3): >>><<< 12081 1726882393.13615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882393.13617: _low_level_execute_command(): starting 12081 1726882393.13619: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/AnsiballZ_package_facts.py && sleep 0' 12081 1726882393.13973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882393.13988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882393.14003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.14021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882393.14060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882393.14075: stderr chunk 
(state=3): >>>debug2: match not found <<< 12081 1726882393.14089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.14106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882393.14118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882393.14129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882393.14140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882393.14154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882393.14172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882393.14185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882393.14196: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882393.14210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.14286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882393.14304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882393.14319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882393.14460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882393.61688: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", 
"release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", 
"version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 12081 1726882393.61710: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": 
"glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 12081 
1726882393.61737: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": 
"8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 12081 1726882393.61742: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": 
"gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", 
"version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 12081 1726882393.61753: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": 
[{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 12081 1726882393.61756: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": 
"1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", 
"release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 12081 1726882393.61766: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 12081 1726882393.61784: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 12081 1726882393.61811: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", 
"release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 12081 1726882393.61815: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 12081 1726882393.61823: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 12081 1726882393.61836: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 12081 1726882393.61844: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", 
"version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 12081 1726882393.61865: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", 
"release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 12081 1726882393.61884: stdout chunk (state=3): >>>4", "source": 
"rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": 
"1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 12081 1726882393.61901: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 12081 1726882393.61907: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 
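The stdout chunk ending above is the complete JSON result of Ansible's `package_facts` module: `ansible_facts.packages` maps each package name to a list of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. As a minimal sketch (not part of the original run, using a trimmed two-package sample in the same shape), this is how that structure can be parsed into rpm-style `name-[epoch:]version-release.arch` strings:

```python
import json

# Trimmed sample mirroring the ansible_facts.packages shape from the log above.
# Each package name maps to a LIST of entries (multiple installed versions are
# possible); epoch is either an integer or JSON null.
sample = json.loads("""
{"ansible_facts": {"packages": {
  "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2",
    "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}],
  "bash": [{"name": "bash", "version": "5.1.8",
    "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
""")

def nevra(pkg):
    # Format one entry as name-[epoch:]version-release.arch; omit the epoch
    # prefix when it is null, matching how rpm prints unversioned epochs.
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

packages = sample["ansible_facts"]["packages"]
for name, entries in sorted(packages.items()):
    for p in entries:
        print(nevra(p))
```

In a playbook the same lookup is typically done with a conditional such as `when: "'openssl-devel' in ansible_facts.packages"` after a `package_facts` task; the snippet above only illustrates the data shape, not the module invocation.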
12081 1726882393.63398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882393.63453: stderr chunk (state=3): >>><<< 12081 1726882393.63456: stdout chunk (state=3): >>><<< 12081 1726882393.63497: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": 
[{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": 
"libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": 
"findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": 
"21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": 
[{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": 
"systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", 
"version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": 
"3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": 
"gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": 
"subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", 
"release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": 
[{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.9.105 closed.
12081 1726882393.65793: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882393.65815: _low_level_execute_command(): starting 12081 1726882393.65818: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882392.6357443-12623-214896088252770/ > /dev/null 2>&1 && sleep 0' 12081 1726882393.66242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882393.66247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882393.66282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.66294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882393.66345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882393.66360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882393.66466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882393.68359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882393.68417: stderr chunk (state=3): >>><<< 12081 1726882393.68420: stdout chunk (state=3): >>><<< 12081 1726882393.68434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882393.68441: handler run complete 12081 1726882393.69374: variable 'ansible_facts' from 
source: unknown
12081 1726882393.69990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882393.72164: variable 'ansible_facts' from source: unknown
12081 1726882393.72968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882393.73526: attempt loop complete, returning result
12081 1726882393.73537: _execute() done
12081 1726882393.73540: dumping result to json
12081 1726882393.73756: done dumping result, returning
12081 1726882393.73765: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-0a3f-ff3c-0000000002da]
12081 1726882393.73772: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002da
12081 1726882393.76323: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000002da
12081 1726882393.76326: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12081 1726882393.76432: no more pending results, returning what we have
12081 1726882393.76435: results queue empty
12081 1726882393.76436: checking for any_errors_fatal
12081 1726882393.76440: done checking for any_errors_fatal
12081 1726882393.76440: checking for max_fail_percentage
12081 1726882393.76442: done checking for max_fail_percentage
12081 1726882393.76443: checking to see if all hosts have failed and the running result is not ok
12081 1726882393.76444: done checking to see if all hosts have failed
12081 1726882393.76445: getting the remaining hosts for this loop
12081 1726882393.76446: done getting the remaining hosts for this loop
12081 1726882393.76450: getting the next task for host managed_node3
12081 1726882393.76460: done getting next task for host managed_node3
12081 1726882393.76464: ^ task is: TASK: fedora.linux_system_roles.network : Print
network provider 12081 1726882393.76471: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882393.76481: getting variables 12081 1726882393.76482: in VariableManager get_vars() 12081 1726882393.76508: Calling all_inventory to load vars for managed_node3 12081 1726882393.76511: Calling groups_inventory to load vars for managed_node3 12081 1726882393.76514: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882393.76523: Calling all_plugins_play to load vars for managed_node3 12081 1726882393.76526: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882393.76529: Calling groups_plugins_play to load vars for managed_node3 12081 1726882393.77931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882393.79887: done with get_vars() 12081 1726882393.79910: done getting variables 12081 1726882393.79977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:13 -0400 (0:00:01.229) 0:00:13.603 ****** 12081 1726882393.80016: entering _queue_task() for managed_node3/debug 12081 1726882393.80316: worker is 1 (out of 1 available) 12081 1726882393.80328: exiting _queue_task() for managed_node3/debug 12081 1726882393.80343: done queuing things up, now waiting for results queue to drain 12081 1726882393.80344: waiting for pending results... 
12081 1726882393.81056: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882393.81188: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000278 12081 1726882393.81207: variable 'ansible_search_path' from source: unknown 12081 1726882393.81213: variable 'ansible_search_path' from source: unknown 12081 1726882393.81250: calling self._execute() 12081 1726882393.81336: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882393.81347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882393.81360: variable 'omit' from source: magic vars 12081 1726882393.81690: variable 'ansible_distribution_major_version' from source: facts 12081 1726882393.81707: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882393.81717: variable 'omit' from source: magic vars 12081 1726882393.81780: variable 'omit' from source: magic vars 12081 1726882393.81874: variable 'network_provider' from source: set_fact 12081 1726882393.81898: variable 'omit' from source: magic vars 12081 1726882393.81942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882393.81979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882393.82002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882393.82024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882393.82039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882393.82076: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882393.82083: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882393.82090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882393.82191: Set connection var ansible_pipelining to False 12081 1726882393.82199: Set connection var ansible_shell_type to sh 12081 1726882393.82210: Set connection var ansible_shell_executable to /bin/sh 12081 1726882393.82216: Set connection var ansible_connection to ssh 12081 1726882393.82227: Set connection var ansible_timeout to 10 12081 1726882393.82235: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882393.82265: variable 'ansible_shell_executable' from source: unknown 12081 1726882393.82274: variable 'ansible_connection' from source: unknown 12081 1726882393.82281: variable 'ansible_module_compression' from source: unknown 12081 1726882393.82287: variable 'ansible_shell_type' from source: unknown 12081 1726882393.82292: variable 'ansible_shell_executable' from source: unknown 12081 1726882393.82298: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882393.82304: variable 'ansible_pipelining' from source: unknown 12081 1726882393.82310: variable 'ansible_timeout' from source: unknown 12081 1726882393.82317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882393.82449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882393.82466: variable 'omit' from source: magic vars 12081 1726882393.82476: starting attempt loop 12081 1726882393.82481: running the handler 12081 1726882393.82528: handler run complete 12081 1726882393.82547: attempt loop complete, returning result 12081 1726882393.82557: _execute() done 12081 1726882393.82566: dumping result to json 12081 1726882393.82573: done dumping result, returning 
12081 1726882393.82584: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-0a3f-ff3c-000000000278]
12081 1726882393.82594: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000278
ok: [managed_node3] => {}

MSG:

Using network provider: nm
12081 1726882393.82748: no more pending results, returning what we have
12081 1726882393.82754: results queue empty
12081 1726882393.82755: checking for any_errors_fatal
12081 1726882393.82765: done checking for any_errors_fatal
12081 1726882393.82766: checking for max_fail_percentage
12081 1726882393.82768: done checking for max_fail_percentage
12081 1726882393.82769: checking to see if all hosts have failed and the running result is not ok
12081 1726882393.82770: done checking to see if all hosts have failed
12081 1726882393.82771: getting the remaining hosts for this loop
12081 1726882393.82772: done getting the remaining hosts for this loop
12081 1726882393.82776: getting the next task for host managed_node3
12081 1726882393.82783: done getting next task for host managed_node3
12081 1726882393.82787: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12081 1726882393.82792: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882393.82807: getting variables 12081 1726882393.82809: in VariableManager get_vars() 12081 1726882393.82842: Calling all_inventory to load vars for managed_node3 12081 1726882393.82845: Calling groups_inventory to load vars for managed_node3 12081 1726882393.82847: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882393.82860: Calling all_plugins_play to load vars for managed_node3 12081 1726882393.82864: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882393.82868: Calling groups_plugins_play to load vars for managed_node3 12081 1726882393.84072: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000278 12081 1726882393.84076: WORKER PROCESS EXITING 12081 1726882393.84921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882393.88895: done with get_vars() 12081 1726882393.88923: done getting variables 12081 1726882393.89017: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:13 -0400 (0:00:00.090) 0:00:13.693 ****** 12081 1726882393.89058: entering _queue_task() for managed_node3/fail 12081 1726882393.89060: Creating lock for fail 12081 1726882393.89627: worker is 1 (out of 1 available) 12081 1726882393.89645: exiting _queue_task() for managed_node3/fail 12081 1726882393.89661: done queuing things up, now waiting for results queue to drain 12081 1726882393.89662: waiting for pending results... 12081 1726882393.90410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882393.90686: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000279 12081 1726882393.90821: variable 'ansible_search_path' from source: unknown 12081 1726882393.90831: variable 'ansible_search_path' from source: unknown 12081 1726882393.90873: calling self._execute() 12081 1726882393.90962: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882393.91041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882393.91056: variable 'omit' from source: magic vars 12081 1726882393.91841: variable 'ansible_distribution_major_version' from source: facts 12081 1726882393.91858: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882393.91979: variable 'network_state' from source: role '' defaults 12081 1726882393.92135: Evaluated conditional (network_state != {}): False 12081 1726882393.92143: when evaluation is False, skipping this task 12081 1726882393.92149: _execute() done 12081 1726882393.92156: dumping result to json 12081 1726882393.92163: done dumping result, returning 12081 1726882393.92176: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-0a3f-ff3c-000000000279]
12081 1726882393.92188: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000279
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12081 1726882393.92339: no more pending results, returning what we have
12081 1726882393.92344: results queue empty
12081 1726882393.92345: checking for any_errors_fatal
12081 1726882393.92354: done checking for any_errors_fatal
12081 1726882393.92355: checking for max_fail_percentage
12081 1726882393.92357: done checking for max_fail_percentage
12081 1726882393.92358: checking to see if all hosts have failed and the running result is not ok
12081 1726882393.92359: done checking to see if all hosts have failed
12081 1726882393.92360: getting the remaining hosts for this loop
12081 1726882393.92362: done getting the remaining hosts for this loop
12081 1726882393.92368: getting the next task for host managed_node3
12081 1726882393.92377: done getting next task for host managed_node3
12081 1726882393.92382: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12081 1726882393.92388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882393.92407: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000279 12081 1726882393.92415: WORKER PROCESS EXITING 12081 1726882393.92427: getting variables 12081 1726882393.92430: in VariableManager get_vars() 12081 1726882393.92467: Calling all_inventory to load vars for managed_node3 12081 1726882393.92470: Calling groups_inventory to load vars for managed_node3 12081 1726882393.92472: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882393.92484: Calling all_plugins_play to load vars for managed_node3 12081 1726882393.92487: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882393.92489: Calling groups_plugins_play to load vars for managed_node3 12081 1726882393.94684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882393.98559: done with get_vars() 12081 1726882393.98587: done getting variables 12081 1726882393.98644: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:13 -0400 (0:00:00.096) 0:00:13.789 ****** 12081 1726882393.98684: entering _queue_task() for managed_node3/fail 12081 1726882393.99474: worker is 1 (out of 1 available) 12081 1726882393.99485: exiting _queue_task() for managed_node3/fail 12081 1726882393.99498: done queuing things up, now waiting for results queue to drain 12081 1726882393.99499: waiting for pending results... 12081 1726882393.99786: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882393.99916: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027a 12081 1726882393.99929: variable 'ansible_search_path' from source: unknown 12081 1726882393.99933: variable 'ansible_search_path' from source: unknown 12081 1726882393.99982: calling self._execute() 12081 1726882394.00080: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.00084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.00094: variable 'omit' from source: magic vars 12081 1726882394.00447: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.00474: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.00588: variable 'network_state' from source: role '' defaults 12081 1726882394.00606: Evaluated conditional (network_state != {}): False 12081 1726882394.00609: when evaluation is False, skipping this task 12081 1726882394.00612: _execute() done 12081 1726882394.00615: dumping result to json 12081 1726882394.00617: done dumping result, returning 12081 1726882394.00621: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-0a3f-ff3c-00000000027a] 12081 1726882394.00628: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027a skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882394.00780: no more pending results, returning what we have 12081 1726882394.00785: results queue empty 12081 1726882394.00785: checking for any_errors_fatal 12081 1726882394.00794: done checking for any_errors_fatal 12081 1726882394.00795: checking for max_fail_percentage 12081 1726882394.00797: done checking for max_fail_percentage 12081 1726882394.00798: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.00800: done checking to see if all hosts have failed 12081 1726882394.00800: getting the remaining hosts for this loop 12081 1726882394.00803: done getting the remaining hosts for this loop 12081 1726882394.00808: getting the next task for host managed_node3 12081 1726882394.00819: done getting next task for host managed_node3 12081 1726882394.00823: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882394.00829: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882394.00848: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027a 12081 1726882394.00852: WORKER PROCESS EXITING 12081 1726882394.00859: getting variables 12081 1726882394.00861: in VariableManager get_vars() 12081 1726882394.00900: Calling all_inventory to load vars for managed_node3 12081 1726882394.00903: Calling groups_inventory to load vars for managed_node3 12081 1726882394.00906: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.00918: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.00921: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.00924: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.03387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.06513: done with get_vars() 12081 1726882394.06542: done getting variables 12081 1726882394.06604: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:14 -0400 (0:00:00.079) 0:00:13.869 ****** 12081 1726882394.06639: entering _queue_task() for managed_node3/fail 12081 1726882394.06937: worker is 1 (out of 1 available) 12081 1726882394.06949: exiting _queue_task() for managed_node3/fail 12081 1726882394.06964: done queuing things up, now waiting for results queue to drain 12081 1726882394.06966: waiting for pending results... 12081 1726882394.07243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882394.07366: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027b 12081 1726882394.07380: variable 'ansible_search_path' from source: unknown 12081 1726882394.07384: variable 'ansible_search_path' from source: unknown 12081 1726882394.07425: calling self._execute() 12081 1726882394.07508: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.07512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.07529: variable 'omit' from source: magic vars 12081 1726882394.08149: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.08165: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.08347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.11630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.11716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.11761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 
1726882394.11804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.11836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.11923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.11956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.11994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.12039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.12058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.12169: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.12250: Evaluated conditional (ansible_distribution_major_version | int > 9): False 12081 1726882394.12258: when evaluation is False, skipping this task 12081 1726882394.12268: _execute() done 12081 1726882394.12275: dumping result to json 12081 1726882394.12282: done dumping result, returning 12081 1726882394.12298: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-0a3f-ff3c-00000000027b] 12081 1726882394.12307: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027b skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 12081 1726882394.12543: no more pending results, returning what we have 12081 1726882394.12547: results queue empty 12081 1726882394.12548: checking for any_errors_fatal 12081 1726882394.12558: done checking for any_errors_fatal 12081 1726882394.12559: checking for max_fail_percentage 12081 1726882394.12560: done checking for max_fail_percentage 12081 1726882394.12561: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.12562: done checking to see if all hosts have failed 12081 1726882394.12565: getting the remaining hosts for this loop 12081 1726882394.12567: done getting the remaining hosts for this loop 12081 1726882394.12571: getting the next task for host managed_node3 12081 1726882394.12579: done getting next task for host managed_node3 12081 1726882394.12583: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882394.12588: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882394.12605: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027b 12081 1726882394.12610: WORKER PROCESS EXITING 12081 1726882394.12616: getting variables 12081 1726882394.12618: in VariableManager get_vars() 12081 1726882394.12662: Calling all_inventory to load vars for managed_node3 12081 1726882394.12669: Calling groups_inventory to load vars for managed_node3 12081 1726882394.12671: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.12683: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.12686: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.12689: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.14554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.16377: done with get_vars() 12081 1726882394.16404: done getting variables 12081 1726882394.16512: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:14 -0400 (0:00:00.099) 0:00:13.968 ****** 12081 1726882394.16553: entering _queue_task() for managed_node3/dnf 12081 1726882394.16882: worker is 1 (out of 1 available) 12081 1726882394.16894: exiting _queue_task() for managed_node3/dnf 12081 1726882394.16910: done queuing things up, now waiting for results queue to drain 12081 1726882394.16912: waiting for pending results... 12081 1726882394.17253: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882394.17535: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027c 12081 1726882394.17721: variable 'ansible_search_path' from source: unknown 12081 1726882394.17730: variable 'ansible_search_path' from source: unknown 12081 1726882394.17783: calling self._execute() 12081 1726882394.17888: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.17900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.17922: variable 'omit' from source: magic vars 12081 1726882394.18776: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.18802: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.19305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.24325: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.24524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.24616: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.24812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.24842: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.25012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.25044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.25098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.25266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.25286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.25502: variable 'ansible_distribution' from source: facts 12081 1726882394.25598: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.25619: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12081 1726882394.25750: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882394.25881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.25905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.25930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.25979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.26008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.26058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.26092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.26126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.26289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.26309: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.26362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.26439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.26476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.26522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.26549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.26733: variable 'network_connections' from source: include params 12081 1726882394.26760: variable 'controller_profile' from source: play vars 12081 1726882394.26832: variable 'controller_profile' from source: play vars 12081 1726882394.26846: variable 'controller_device' from source: play vars 12081 1726882394.26921: variable 'controller_device' from source: play vars 12081 1726882394.26940: variable 'port1_profile' from source: play vars 12081 1726882394.27012: variable 'port1_profile' from source: play vars 12081 1726882394.27024: variable 'dhcp_interface1' from source: play vars 12081 1726882394.27101: variable 'dhcp_interface1' from source: play vars 12081 
1726882394.27113: variable 'controller_profile' from source: play vars 12081 1726882394.27178: variable 'controller_profile' from source: play vars 12081 1726882394.27197: variable 'port2_profile' from source: play vars 12081 1726882394.27261: variable 'port2_profile' from source: play vars 12081 1726882394.27275: variable 'dhcp_interface2' from source: play vars 12081 1726882394.27344: variable 'dhcp_interface2' from source: play vars 12081 1726882394.27359: variable 'controller_profile' from source: play vars 12081 1726882394.27432: variable 'controller_profile' from source: play vars 12081 1726882394.27526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882394.27728: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882394.27784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882394.27819: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882394.27866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882394.27916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882394.27943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882394.27987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.28018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882394.28094: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882394.28406: variable 'network_connections' from source: include params 12081 1726882394.28417: variable 'controller_profile' from source: play vars 12081 1726882394.28493: variable 'controller_profile' from source: play vars 12081 1726882394.28511: variable 'controller_device' from source: play vars 12081 1726882394.28578: variable 'controller_device' from source: play vars 12081 1726882394.28594: variable 'port1_profile' from source: play vars 12081 1726882394.28669: variable 'port1_profile' from source: play vars 12081 1726882394.28681: variable 'dhcp_interface1' from source: play vars 12081 1726882394.28753: variable 'dhcp_interface1' from source: play vars 12081 1726882394.28767: variable 'controller_profile' from source: play vars 12081 1726882394.28837: variable 'controller_profile' from source: play vars 12081 1726882394.28849: variable 'port2_profile' from source: play vars 12081 1726882394.28910: variable 'port2_profile' from source: play vars 12081 1726882394.28920: variable 'dhcp_interface2' from source: play vars 12081 1726882394.28983: variable 'dhcp_interface2' from source: play vars 12081 1726882394.28992: variable 'controller_profile' from source: play vars 12081 1726882394.29053: variable 'controller_profile' from source: play vars 12081 1726882394.29094: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882394.29101: when evaluation is False, skipping this task 12081 1726882394.29106: _execute() done 12081 1726882394.29111: dumping result to json 12081 1726882394.29117: done dumping result, returning 12081 1726882394.29126: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available 
through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000027c] 12081 1726882394.29135: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027c 12081 1726882394.29253: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027c 12081 1726882394.29266: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882394.29316: no more pending results, returning what we have 12081 1726882394.29321: results queue empty 12081 1726882394.29322: checking for any_errors_fatal 12081 1726882394.29329: done checking for any_errors_fatal 12081 1726882394.29330: checking for max_fail_percentage 12081 1726882394.29332: done checking for max_fail_percentage 12081 1726882394.29333: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.29334: done checking to see if all hosts have failed 12081 1726882394.29335: getting the remaining hosts for this loop 12081 1726882394.29337: done getting the remaining hosts for this loop 12081 1726882394.29341: getting the next task for host managed_node3 12081 1726882394.29350: done getting next task for host managed_node3 12081 1726882394.29356: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882394.29362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882394.29377: getting variables 12081 1726882394.29379: in VariableManager get_vars() 12081 1726882394.29415: Calling all_inventory to load vars for managed_node3 12081 1726882394.29417: Calling groups_inventory to load vars for managed_node3 12081 1726882394.29420: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.29431: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.29434: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.29436: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.31327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.33094: done with get_vars() 12081 1726882394.33124: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882394.33217: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:14 -0400 (0:00:00.166) 0:00:14.135 ****** 12081 1726882394.33252: entering _queue_task() for managed_node3/yum 12081 1726882394.33255: Creating lock for yum 12081 1726882394.33609: worker is 1 (out of 1 available) 12081 1726882394.33620: exiting _queue_task() for managed_node3/yum 12081 1726882394.33633: done queuing things up, now waiting for results queue to drain 12081 1726882394.33635: waiting for pending results... 12081 1726882394.33936: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882394.34088: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027d 12081 1726882394.34107: variable 'ansible_search_path' from source: unknown 12081 1726882394.34115: variable 'ansible_search_path' from source: unknown 12081 1726882394.34166: calling self._execute() 12081 1726882394.34267: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.34278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.34294: variable 'omit' from source: magic vars 12081 1726882394.34680: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.34699: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.34902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.37486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.37577: 
Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.37622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.37668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.37698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.37788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.37823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.37855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.37911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.37933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.38045: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.38071: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12081 1726882394.38078: when evaluation is False, skipping this task 12081 1726882394.38085: _execute() done 12081 1726882394.38093: dumping result to json 12081 
1726882394.38102: done dumping result, returning 12081 1726882394.38117: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000027d] 12081 1726882394.38133: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12081 1726882394.38313: no more pending results, returning what we have 12081 1726882394.38318: results queue empty 12081 1726882394.38319: checking for any_errors_fatal 12081 1726882394.38325: done checking for any_errors_fatal 12081 1726882394.38326: checking for max_fail_percentage 12081 1726882394.38328: done checking for max_fail_percentage 12081 1726882394.38329: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.38331: done checking to see if all hosts have failed 12081 1726882394.38331: getting the remaining hosts for this loop 12081 1726882394.38333: done getting the remaining hosts for this loop 12081 1726882394.38338: getting the next task for host managed_node3 12081 1726882394.38346: done getting next task for host managed_node3 12081 1726882394.38353: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882394.38359: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882394.38378: getting variables 12081 1726882394.38380: in VariableManager get_vars() 12081 1726882394.38420: Calling all_inventory to load vars for managed_node3 12081 1726882394.38423: Calling groups_inventory to load vars for managed_node3 12081 1726882394.38426: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.38438: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.38442: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.38446: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.39443: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027d 12081 1726882394.39447: WORKER PROCESS EXITING 12081 1726882394.40311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.42350: done with get_vars() 12081 1726882394.42390: done getting variables 12081 1726882394.42454: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:14 -0400 (0:00:00.092) 0:00:14.227 ****** 12081 1726882394.42501: entering _queue_task() for managed_node3/fail 12081 1726882394.42841: worker is 1 (out of 1 available) 12081 1726882394.42858: exiting _queue_task() for managed_node3/fail 12081 1726882394.42876: done queuing things up, now waiting for results queue to drain 12081 1726882394.42878: waiting for pending results... 12081 1726882394.43188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882394.43331: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027e 12081 1726882394.43342: variable 'ansible_search_path' from source: unknown 12081 1726882394.43346: variable 'ansible_search_path' from source: unknown 12081 1726882394.43401: calling self._execute() 12081 1726882394.43465: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.43470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.43481: variable 'omit' from source: magic vars 12081 1726882394.44374: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.44378: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.44381: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882394.44384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.47405: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.47493: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.47533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.47576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.47605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.47691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.47723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.47758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.47805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.47822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.47874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 
1726882394.47902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.47931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.47979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.47999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.48041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.48072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.48100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.48143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.48166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 12081 1726882394.48345: variable 'network_connections' from source: include params 12081 1726882394.48368: variable 'controller_profile' from source: play vars 12081 1726882394.48440: variable 'controller_profile' from source: play vars 12081 1726882394.48458: variable 'controller_device' from source: play vars 12081 1726882394.48522: variable 'controller_device' from source: play vars 12081 1726882394.48542: variable 'port1_profile' from source: play vars 12081 1726882394.48607: variable 'port1_profile' from source: play vars 12081 1726882394.48618: variable 'dhcp_interface1' from source: play vars 12081 1726882394.48675: variable 'dhcp_interface1' from source: play vars 12081 1726882394.48684: variable 'controller_profile' from source: play vars 12081 1726882394.48736: variable 'controller_profile' from source: play vars 12081 1726882394.48747: variable 'port2_profile' from source: play vars 12081 1726882394.48806: variable 'port2_profile' from source: play vars 12081 1726882394.48816: variable 'dhcp_interface2' from source: play vars 12081 1726882394.48880: variable 'dhcp_interface2' from source: play vars 12081 1726882394.48893: variable 'controller_profile' from source: play vars 12081 1726882394.48959: variable 'controller_profile' from source: play vars 12081 1726882394.49036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882394.49374: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882394.49416: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882394.49450: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882394.49491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882394.49535: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882394.49568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882394.49600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.49630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882394.49707: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882394.50029: variable 'network_connections' from source: include params 12081 1726882394.50040: variable 'controller_profile' from source: play vars 12081 1726882394.50109: variable 'controller_profile' from source: play vars 12081 1726882394.50122: variable 'controller_device' from source: play vars 12081 1726882394.50190: variable 'controller_device' from source: play vars 12081 1726882394.50284: variable 'port1_profile' from source: play vars 12081 1726882394.50344: variable 'port1_profile' from source: play vars 12081 1726882394.50399: variable 'dhcp_interface1' from source: play vars 12081 1726882394.50525: variable 'dhcp_interface1' from source: play vars 12081 1726882394.50543: variable 'controller_profile' from source: play vars 12081 1726882394.50729: variable 'controller_profile' from source: play vars 12081 1726882394.50740: variable 'port2_profile' from source: play vars 12081 1726882394.50807: variable 'port2_profile' from source: play vars 12081 1726882394.50879: variable 'dhcp_interface2' from source: play vars 12081 1726882394.50940: 
variable 'dhcp_interface2' from source: play vars 12081 1726882394.51122: variable 'controller_profile' from source: play vars 12081 1726882394.51190: variable 'controller_profile' from source: play vars 12081 1726882394.51232: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882394.51239: when evaluation is False, skipping this task 12081 1726882394.51245: _execute() done 12081 1726882394.51255: dumping result to json 12081 1726882394.51262: done dumping result, returning 12081 1726882394.51276: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000027e] 12081 1726882394.51285: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882394.51444: no more pending results, returning what we have 12081 1726882394.51449: results queue empty 12081 1726882394.51449: checking for any_errors_fatal 12081 1726882394.51456: done checking for any_errors_fatal 12081 1726882394.51457: checking for max_fail_percentage 12081 1726882394.51458: done checking for max_fail_percentage 12081 1726882394.51459: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.51461: done checking to see if all hosts have failed 12081 1726882394.51462: getting the remaining hosts for this loop 12081 1726882394.51465: done getting the remaining hosts for this loop 12081 1726882394.51469: getting the next task for host managed_node3 12081 1726882394.51478: done getting next task for host managed_node3 12081 1726882394.51481: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12081 1726882394.51486: ^ state is: HOST STATE: 
block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882394.51505: getting variables 12081 1726882394.51507: in VariableManager get_vars() 12081 1726882394.51540: Calling all_inventory to load vars for managed_node3 12081 1726882394.51543: Calling groups_inventory to load vars for managed_node3 12081 1726882394.51545: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.51557: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.51559: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.51563: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.52082: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027e 12081 1726882394.52086: WORKER PROCESS EXITING 12081 1726882394.53425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.54874: done with get_vars() 12081 1726882394.54896: done getting variables 12081 1726882394.54956: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:14 -0400 (0:00:00.124) 0:00:14.352 ****** 12081 1726882394.54983: entering _queue_task() for managed_node3/package 12081 1726882394.55313: worker is 1 (out of 1 available) 12081 1726882394.55328: exiting _queue_task() for managed_node3/package 12081 1726882394.55342: done queuing things up, now waiting for results queue to drain 12081 1726882394.55344: waiting for pending results... 
12081 1726882394.55782: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12081 1726882394.55789: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000027f 12081 1726882394.55889: variable 'ansible_search_path' from source: unknown 12081 1726882394.55895: variable 'ansible_search_path' from source: unknown 12081 1726882394.55900: calling self._execute() 12081 1726882394.55933: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.55937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.55945: variable 'omit' from source: magic vars 12081 1726882394.56689: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.56715: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.56936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882394.57252: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882394.57306: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882394.57343: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882394.57392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882394.57518: variable 'network_packages' from source: role '' defaults 12081 1726882394.57717: variable '__network_provider_setup' from source: role '' defaults 12081 1726882394.57753: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882394.57858: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882394.57895: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882394.57999: variable 
'__network_packages_default_nm' from source: role '' defaults 12081 1726882394.58213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.59766: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.59810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.59836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.60354: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.60358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.60404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.60432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.60461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.60504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.60517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 
1726882394.60564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.60586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.60608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.60644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.60661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.60889: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882394.61069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.61208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.61233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.61278: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.61286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.61500: variable 'ansible_python' from source: facts 12081 1726882394.61520: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882394.61728: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882394.61890: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882394.62026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.62049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.62101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.62128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.62143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.62196: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.62220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.62245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.62291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.62316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.62457: variable 'network_connections' from source: include params 12081 1726882394.62460: variable 'controller_profile' from source: play vars 12081 1726882394.62568: variable 'controller_profile' from source: play vars 12081 1726882394.62577: variable 'controller_device' from source: play vars 12081 1726882394.62678: variable 'controller_device' from source: play vars 12081 1726882394.62694: variable 'port1_profile' from source: play vars 12081 1726882394.62800: variable 'port1_profile' from source: play vars 12081 1726882394.62808: variable 'dhcp_interface1' from source: play vars 12081 1726882394.62979: variable 'dhcp_interface1' from source: play vars 12081 1726882394.62996: variable 'controller_profile' from source: play vars 12081 1726882394.63148: variable 'controller_profile' from source: play vars 12081 1726882394.63170: variable 'port2_profile' from source: play vars 12081 
1726882394.63256: variable 'port2_profile' from source: play vars 12081 1726882394.63267: variable 'dhcp_interface2' from source: play vars 12081 1726882394.63334: variable 'dhcp_interface2' from source: play vars 12081 1726882394.63341: variable 'controller_profile' from source: play vars 12081 1726882394.63422: variable 'controller_profile' from source: play vars 12081 1726882394.63489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882394.63509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882394.63529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.63549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882394.63594: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882394.63780: variable 'network_connections' from source: include params 12081 1726882394.63783: variable 'controller_profile' from source: play vars 12081 1726882394.63854: variable 'controller_profile' from source: play vars 12081 1726882394.63866: variable 'controller_device' from source: play vars 12081 1726882394.63934: variable 'controller_device' from source: play vars 12081 1726882394.63945: variable 'port1_profile' from source: play vars 12081 1726882394.64015: variable 'port1_profile' from source: play vars 12081 1726882394.64026: variable 'dhcp_interface1' from source: play vars 12081 1726882394.64093: variable 'dhcp_interface1' from 
source: play vars 12081 1726882394.64100: variable 'controller_profile' from source: play vars 12081 1726882394.64172: variable 'controller_profile' from source: play vars 12081 1726882394.64179: variable 'port2_profile' from source: play vars 12081 1726882394.64245: variable 'port2_profile' from source: play vars 12081 1726882394.64254: variable 'dhcp_interface2' from source: play vars 12081 1726882394.64322: variable 'dhcp_interface2' from source: play vars 12081 1726882394.64329: variable 'controller_profile' from source: play vars 12081 1726882394.64402: variable 'controller_profile' from source: play vars 12081 1726882394.64441: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882394.64502: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882394.64701: variable 'network_connections' from source: include params 12081 1726882394.64704: variable 'controller_profile' from source: play vars 12081 1726882394.64749: variable 'controller_profile' from source: play vars 12081 1726882394.64758: variable 'controller_device' from source: play vars 12081 1726882394.64806: variable 'controller_device' from source: play vars 12081 1726882394.64815: variable 'port1_profile' from source: play vars 12081 1726882394.64861: variable 'port1_profile' from source: play vars 12081 1726882394.64868: variable 'dhcp_interface1' from source: play vars 12081 1726882394.64915: variable 'dhcp_interface1' from source: play vars 12081 1726882394.64920: variable 'controller_profile' from source: play vars 12081 1726882394.64968: variable 'controller_profile' from source: play vars 12081 1726882394.65029: variable 'port2_profile' from source: play vars 12081 1726882394.65036: variable 'port2_profile' from source: play vars 12081 1726882394.65043: variable 'dhcp_interface2' from source: play vars 12081 1726882394.65110: variable 'dhcp_interface2' from source: play vars 12081 1726882394.65113: variable 
'controller_profile' from source: play vars 12081 1726882394.65158: variable 'controller_profile' from source: play vars 12081 1726882394.65212: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882394.65250: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882394.65939: variable 'network_connections' from source: include params 12081 1726882394.65943: variable 'controller_profile' from source: play vars 12081 1726882394.65946: variable 'controller_profile' from source: play vars 12081 1726882394.65948: variable 'controller_device' from source: play vars 12081 1726882394.65953: variable 'controller_device' from source: play vars 12081 1726882394.65955: variable 'port1_profile' from source: play vars 12081 1726882394.65958: variable 'port1_profile' from source: play vars 12081 1726882394.65960: variable 'dhcp_interface1' from source: play vars 12081 1726882394.65962: variable 'dhcp_interface1' from source: play vars 12081 1726882394.66295: variable 'controller_profile' from source: play vars 12081 1726882394.66298: variable 'controller_profile' from source: play vars 12081 1726882394.66301: variable 'port2_profile' from source: play vars 12081 1726882394.66303: variable 'port2_profile' from source: play vars 12081 1726882394.66306: variable 'dhcp_interface2' from source: play vars 12081 1726882394.66308: variable 'dhcp_interface2' from source: play vars 12081 1726882394.66310: variable 'controller_profile' from source: play vars 12081 1726882394.66312: variable 'controller_profile' from source: play vars 12081 1726882394.66314: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882394.66316: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882394.66317: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882394.66319: variable '__network_packages_default_initscripts' from 
source: role '' defaults 12081 1726882394.66447: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882394.70416: variable 'network_connections' from source: include params 12081 1726882394.70420: variable 'controller_profile' from source: play vars 12081 1726882394.70489: variable 'controller_profile' from source: play vars 12081 1726882394.70497: variable 'controller_device' from source: play vars 12081 1726882394.70556: variable 'controller_device' from source: play vars 12081 1726882394.70565: variable 'port1_profile' from source: play vars 12081 1726882394.70621: variable 'port1_profile' from source: play vars 12081 1726882394.70628: variable 'dhcp_interface1' from source: play vars 12081 1726882394.70686: variable 'dhcp_interface1' from source: play vars 12081 1726882394.70692: variable 'controller_profile' from source: play vars 12081 1726882394.70748: variable 'controller_profile' from source: play vars 12081 1726882394.70756: variable 'port2_profile' from source: play vars 12081 1726882394.70814: variable 'port2_profile' from source: play vars 12081 1726882394.70820: variable 'dhcp_interface2' from source: play vars 12081 1726882394.70877: variable 'dhcp_interface2' from source: play vars 12081 1726882394.70885: variable 'controller_profile' from source: play vars 12081 1726882394.70941: variable 'controller_profile' from source: play vars 12081 1726882394.70949: variable 'ansible_distribution' from source: facts 12081 1726882394.70955: variable '__network_rh_distros' from source: role '' defaults 12081 1726882394.70957: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.70987: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882394.71148: variable 'ansible_distribution' from source: facts 12081 1726882394.71154: variable '__network_rh_distros' from source: role '' defaults 12081 1726882394.71157: variable 
'ansible_distribution_major_version' from source: facts 12081 1726882394.71168: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882394.71325: variable 'ansible_distribution' from source: facts 12081 1726882394.71328: variable '__network_rh_distros' from source: role '' defaults 12081 1726882394.71330: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.71367: variable 'network_provider' from source: set_fact 12081 1726882394.71382: variable 'ansible_facts' from source: unknown 12081 1726882394.71835: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12081 1726882394.71839: when evaluation is False, skipping this task 12081 1726882394.71841: _execute() done 12081 1726882394.71844: dumping result to json 12081 1726882394.71846: done dumping result, returning 12081 1726882394.71854: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-0a3f-ff3c-00000000027f] 12081 1726882394.71857: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027f 12081 1726882394.71943: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000027f 12081 1726882394.71945: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12081 1726882394.71991: no more pending results, returning what we have 12081 1726882394.71995: results queue empty 12081 1726882394.71995: checking for any_errors_fatal 12081 1726882394.72001: done checking for any_errors_fatal 12081 1726882394.72001: checking for max_fail_percentage 12081 1726882394.72003: done checking for max_fail_percentage 12081 1726882394.72004: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.72004: done checking to see if all hosts have failed 12081 
1726882394.72005: getting the remaining hosts for this loop 12081 1726882394.72007: done getting the remaining hosts for this loop 12081 1726882394.72010: getting the next task for host managed_node3 12081 1726882394.72017: done getting next task for host managed_node3 12081 1726882394.72021: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882394.72025: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882394.72038: getting variables 12081 1726882394.72040: in VariableManager get_vars() 12081 1726882394.72079: Calling all_inventory to load vars for managed_node3 12081 1726882394.72082: Calling groups_inventory to load vars for managed_node3 12081 1726882394.72084: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.72094: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.72096: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.72098: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.75609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.76538: done with get_vars() 12081 1726882394.76558: done getting variables 12081 1726882394.76595: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:14 -0400 (0:00:00.216) 0:00:14.569 ****** 12081 1726882394.76616: entering _queue_task() for managed_node3/package 12081 1726882394.76902: worker is 1 (out of 1 available) 12081 1726882394.76916: exiting _queue_task() for managed_node3/package 12081 1726882394.76929: done queuing things up, now waiting for results queue to drain 12081 1726882394.76931: waiting for pending results... 
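The "Install packages" task above was skipped because its conditional, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False, meaning every required package was already present in the gathered package facts. A minimal sketch of that subset check in plain Python (the variable names `required` and `installed` are illustrative stand-ins, not role code):

```python
# Illustrative sketch of the skip logic seen in the log entry above:
# `required` stands in for network_packages, `installed` for
# ansible_facts.packages.keys().

def install_needed(required, installed):
    """Mirror of the Jinja conditional
    `not network_packages is subset(ansible_facts.packages.keys())`:
    True only when at least one required package is missing."""
    return not set(required).issubset(installed)

required = ["NetworkManager"]
installed = {"NetworkManager": "1.x", "openssh-server": "8.x"}

# All required packages are present, so the conditional is False and the
# task is skipped -- matching the log's "skip_reason".
print(install_needed(required, installed.keys()))  # False
```

When the conditional is False, the task executor short-circuits before any module runs, which is why the result JSON reports only `changed: false` and a `skip_reason`.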
12081 1726882394.77232: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882394.77391: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000280 12081 1726882394.77410: variable 'ansible_search_path' from source: unknown 12081 1726882394.77418: variable 'ansible_search_path' from source: unknown 12081 1726882394.77463: calling self._execute() 12081 1726882394.77557: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.77572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.77586: variable 'omit' from source: magic vars 12081 1726882394.77985: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.78006: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.78125: variable 'network_state' from source: role '' defaults 12081 1726882394.78134: Evaluated conditional (network_state != {}): False 12081 1726882394.78139: when evaluation is False, skipping this task 12081 1726882394.78146: _execute() done 12081 1726882394.78153: dumping result to json 12081 1726882394.78158: done dumping result, returning 12081 1726882394.78176: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000280] 12081 1726882394.78189: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000280 12081 1726882394.78298: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000280 12081 1726882394.78301: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882394.78345: no more pending results, returning what we have 12081 1726882394.78349: results queue empty 12081 1726882394.78349: checking 
for any_errors_fatal 12081 1726882394.78360: done checking for any_errors_fatal 12081 1726882394.78361: checking for max_fail_percentage 12081 1726882394.78365: done checking for max_fail_percentage 12081 1726882394.78366: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.78368: done checking to see if all hosts have failed 12081 1726882394.78368: getting the remaining hosts for this loop 12081 1726882394.78370: done getting the remaining hosts for this loop 12081 1726882394.78374: getting the next task for host managed_node3 12081 1726882394.78382: done getting next task for host managed_node3 12081 1726882394.78386: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882394.78391: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882394.78407: getting variables 12081 1726882394.78408: in VariableManager get_vars() 12081 1726882394.78440: Calling all_inventory to load vars for managed_node3 12081 1726882394.78442: Calling groups_inventory to load vars for managed_node3 12081 1726882394.78444: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.78455: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.78458: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.78460: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.79262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.80362: done with get_vars() 12081 1726882394.80385: done getting variables 12081 1726882394.80444: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:14 -0400 (0:00:00.038) 0:00:14.607 ****** 12081 1726882394.80482: entering _queue_task() for managed_node3/package 12081 1726882394.80767: worker is 1 (out of 1 available) 12081 1726882394.80779: exiting _queue_task() for managed_node3/package 12081 1726882394.80793: done queuing things up, now waiting for results queue to drain 12081 1726882394.80794: waiting for pending results... 
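The two nmstate-related install tasks ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") are both gated on the same conditional, `network_state != {}`. The role default for `network_state` is an empty dict, so a play that only sets `network_connections` skips both tasks, as the log shows. A minimal sketch of that gate (the sample `network_state` value is hypothetical, not taken from this run):

```python
# Sketch of the `network_state != {}` gate evaluated in the log entries above.

def uses_network_state(network_state):
    # The Jinja conditional `network_state != {}` expressed as plain Python:
    # only a non-empty state description triggers the nmstate installs.
    return network_state != {}

print(uses_network_state({}))  # False -> both install tasks are skipped
print(uses_network_state({"interfaces": [{"name": "eth0", "state": "up"}]}))  # True
```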
12081 1726882394.81073: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882394.81218: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000281 12081 1726882394.81241: variable 'ansible_search_path' from source: unknown 12081 1726882394.81249: variable 'ansible_search_path' from source: unknown 12081 1726882394.81295: calling self._execute() 12081 1726882394.81384: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.81393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.81405: variable 'omit' from source: magic vars 12081 1726882394.81777: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.81796: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.81923: variable 'network_state' from source: role '' defaults 12081 1726882394.81937: Evaluated conditional (network_state != {}): False 12081 1726882394.81944: when evaluation is False, skipping this task 12081 1726882394.81953: _execute() done 12081 1726882394.81960: dumping result to json 12081 1726882394.81968: done dumping result, returning 12081 1726882394.81980: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000281] 12081 1726882394.81991: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000281 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882394.82161: no more pending results, returning what we have 12081 1726882394.82167: results queue empty 12081 1726882394.82168: checking for any_errors_fatal 12081 1726882394.82178: done checking for any_errors_fatal 12081 1726882394.82178: checking for max_fail_percentage 12081 
1726882394.82181: done checking for max_fail_percentage 12081 1726882394.82182: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.82183: done checking to see if all hosts have failed 12081 1726882394.82184: getting the remaining hosts for this loop 12081 1726882394.82185: done getting the remaining hosts for this loop 12081 1726882394.82189: getting the next task for host managed_node3 12081 1726882394.82197: done getting next task for host managed_node3 12081 1726882394.82202: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882394.82208: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882394.82224: getting variables 12081 1726882394.82226: in VariableManager get_vars() 12081 1726882394.82268: Calling all_inventory to load vars for managed_node3 12081 1726882394.82271: Calling groups_inventory to load vars for managed_node3 12081 1726882394.82273: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.82286: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.82289: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.82292: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.83441: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000281 12081 1726882394.83445: WORKER PROCESS EXITING 12081 1726882394.84128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.85820: done with get_vars() 12081 1726882394.85854: done getting variables 12081 1726882394.85963: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:14 -0400 (0:00:00.055) 0:00:14.662 ****** 12081 1726882394.86000: entering _queue_task() for managed_node3/service 12081 1726882394.86002: Creating lock for service 12081 1726882394.86341: worker is 1 (out of 1 available) 12081 1726882394.86357: exiting _queue_task() for managed_node3/service 12081 1726882394.86372: done queuing things up, now waiting for results queue to drain 12081 1726882394.86374: waiting for pending results... 
12081 1726882394.86672: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882394.86827: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000282 12081 1726882394.86848: variable 'ansible_search_path' from source: unknown 12081 1726882394.86861: variable 'ansible_search_path' from source: unknown 12081 1726882394.86906: calling self._execute() 12081 1726882394.87008: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.87020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.87039: variable 'omit' from source: magic vars 12081 1726882394.87425: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.87444: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.87581: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882394.87793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.89447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.89506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.89533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.89561: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.89584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.89640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12081 1726882394.89668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.89720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.89727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.89740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.89809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.89814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.89838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.89877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.89891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.89926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.89960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.89985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.90140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.90143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.90200: variable 'network_connections' from source: include params 12081 1726882394.90208: variable 'controller_profile' from source: play vars 12081 1726882394.90284: variable 'controller_profile' from source: play vars 12081 1726882394.90294: variable 'controller_device' from source: play vars 12081 1726882394.90358: variable 'controller_device' from source: play vars 12081 1726882394.90375: variable 'port1_profile' from source: play vars 12081 1726882394.90433: variable 'port1_profile' from source: play vars 12081 1726882394.90445: variable 'dhcp_interface1' from source: play vars 12081 1726882394.90501: variable 'dhcp_interface1' from source: play vars 12081 1726882394.90507: variable 'controller_profile' from source: play vars 
12081 1726882394.90567: variable 'controller_profile' from source: play vars 12081 1726882394.90575: variable 'port2_profile' from source: play vars 12081 1726882394.90633: variable 'port2_profile' from source: play vars 12081 1726882394.90639: variable 'dhcp_interface2' from source: play vars 12081 1726882394.90701: variable 'dhcp_interface2' from source: play vars 12081 1726882394.90707: variable 'controller_profile' from source: play vars 12081 1726882394.90770: variable 'controller_profile' from source: play vars 12081 1726882394.90836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882394.91024: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882394.91059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882394.91092: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882394.91121: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882394.91168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882394.91187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882394.91211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.91236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12081 1726882394.91307: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882394.91497: variable 'network_connections' from source: include params 12081 1726882394.91500: variable 'controller_profile' from source: play vars 12081 1726882394.91542: variable 'controller_profile' from source: play vars 12081 1726882394.91548: variable 'controller_device' from source: play vars 12081 1726882394.91596: variable 'controller_device' from source: play vars 12081 1726882394.91605: variable 'port1_profile' from source: play vars 12081 1726882394.91652: variable 'port1_profile' from source: play vars 12081 1726882394.91660: variable 'dhcp_interface1' from source: play vars 12081 1726882394.91705: variable 'dhcp_interface1' from source: play vars 12081 1726882394.91710: variable 'controller_profile' from source: play vars 12081 1726882394.91751: variable 'controller_profile' from source: play vars 12081 1726882394.91759: variable 'port2_profile' from source: play vars 12081 1726882394.91805: variable 'port2_profile' from source: play vars 12081 1726882394.91811: variable 'dhcp_interface2' from source: play vars 12081 1726882394.91853: variable 'dhcp_interface2' from source: play vars 12081 1726882394.91861: variable 'controller_profile' from source: play vars 12081 1726882394.91908: variable 'controller_profile' from source: play vars 12081 1726882394.91931: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882394.91934: when evaluation is False, skipping this task 12081 1726882394.91937: _execute() done 12081 1726882394.91939: dumping result to json 12081 1726882394.91941: done dumping result, returning 12081 1726882394.91949: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000282] 12081 1726882394.91957: sending task result for 
task 0e448fcc-3ce9-0a3f-ff3c-000000000282 12081 1726882394.92044: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000282 12081 1726882394.92046: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882394.92089: no more pending results, returning what we have 12081 1726882394.92093: results queue empty 12081 1726882394.92094: checking for any_errors_fatal 12081 1726882394.92100: done checking for any_errors_fatal 12081 1726882394.92101: checking for max_fail_percentage 12081 1726882394.92103: done checking for max_fail_percentage 12081 1726882394.92103: checking to see if all hosts have failed and the running result is not ok 12081 1726882394.92104: done checking to see if all hosts have failed 12081 1726882394.92105: getting the remaining hosts for this loop 12081 1726882394.92107: done getting the remaining hosts for this loop 12081 1726882394.92110: getting the next task for host managed_node3 12081 1726882394.92118: done getting next task for host managed_node3 12081 1726882394.92121: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882394.92126: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882394.92140: getting variables 12081 1726882394.92142: in VariableManager get_vars() 12081 1726882394.92177: Calling all_inventory to load vars for managed_node3 12081 1726882394.92180: Calling groups_inventory to load vars for managed_node3 12081 1726882394.92182: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882394.92193: Calling all_plugins_play to load vars for managed_node3 12081 1726882394.92196: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882394.92198: Calling groups_plugins_play to load vars for managed_node3 12081 1726882394.93034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882394.94721: done with get_vars() 12081 1726882394.94744: done getting variables 12081 1726882394.94807: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:14 -0400 (0:00:00.088) 0:00:14.751 
****** 12081 1726882394.94838: entering _queue_task() for managed_node3/service 12081 1726882394.95291: worker is 1 (out of 1 available) 12081 1726882394.95305: exiting _queue_task() for managed_node3/service 12081 1726882394.95323: done queuing things up, now waiting for results queue to drain 12081 1726882394.95325: waiting for pending results... 12081 1726882394.95520: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882394.95609: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000283 12081 1726882394.95620: variable 'ansible_search_path' from source: unknown 12081 1726882394.95623: variable 'ansible_search_path' from source: unknown 12081 1726882394.95658: calling self._execute() 12081 1726882394.95724: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882394.95728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882394.95737: variable 'omit' from source: magic vars 12081 1726882394.96008: variable 'ansible_distribution_major_version' from source: facts 12081 1726882394.96019: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882394.96127: variable 'network_provider' from source: set_fact 12081 1726882394.96130: variable 'network_state' from source: role '' defaults 12081 1726882394.96141: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12081 1726882394.96147: variable 'omit' from source: magic vars 12081 1726882394.96188: variable 'omit' from source: magic vars 12081 1726882394.96207: variable 'network_service_name' from source: role '' defaults 12081 1726882394.96256: variable 'network_service_name' from source: role '' defaults 12081 1726882394.96328: variable '__network_provider_setup' from source: role '' defaults 12081 1726882394.96332: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882394.96379: variable 
'__network_service_name_default_nm' from source: role '' defaults 12081 1726882394.96386: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882394.96432: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882394.96579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882394.98475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882394.98525: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882394.98555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882394.98594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882394.98611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882394.98672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.98716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.98742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.98793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.98816: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.98867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.98936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.98970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.99027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.99047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.99313: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882394.99445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882394.99484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882394.99513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882394.99604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882394.99631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882394.99818: variable 'ansible_python' from source: facts 12081 1726882394.99856: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882394.99938: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882395.00020: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882395.00105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882395.00124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882395.00147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882395.00175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882395.00185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882395.00221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882395.00240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882395.00257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882395.00289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882395.00299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882395.00395: variable 'network_connections' from source: include params 12081 1726882395.00401: variable 'controller_profile' from source: play vars 12081 1726882395.00457: variable 'controller_profile' from source: play vars 12081 1726882395.00467: variable 'controller_device' from source: play vars 12081 1726882395.00516: variable 'controller_device' from source: play vars 12081 1726882395.00530: variable 'port1_profile' from source: play vars 12081 1726882395.00585: variable 'port1_profile' from source: play vars 12081 1726882395.00594: variable 'dhcp_interface1' from source: play vars 12081 1726882395.00648: variable 'dhcp_interface1' from source: play vars 12081 1726882395.00656: variable 'controller_profile' from source: play vars 
12081 1726882395.00706: variable 'controller_profile' from source: play vars 12081 1726882395.00715: variable 'port2_profile' from source: play vars 12081 1726882395.00770: variable 'port2_profile' from source: play vars 12081 1726882395.00777: variable 'dhcp_interface2' from source: play vars 12081 1726882395.00855: variable 'dhcp_interface2' from source: play vars 12081 1726882395.00882: variable 'controller_profile' from source: play vars 12081 1726882395.00950: variable 'controller_profile' from source: play vars 12081 1726882395.01074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882395.01315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882395.01372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882395.01430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882395.01479: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882395.01554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882395.01591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882395.01640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882395.01686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12081 1726882395.01786: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882395.02134: variable 'network_connections' from source: include params 12081 1726882395.02145: variable 'controller_profile' from source: play vars 12081 1726882395.02235: variable 'controller_profile' from source: play vars 12081 1726882395.02257: variable 'controller_device' from source: play vars 12081 1726882395.02349: variable 'controller_device' from source: play vars 12081 1726882395.02377: variable 'port1_profile' from source: play vars 12081 1726882395.02470: variable 'port1_profile' from source: play vars 12081 1726882395.02486: variable 'dhcp_interface1' from source: play vars 12081 1726882395.02580: variable 'dhcp_interface1' from source: play vars 12081 1726882395.02595: variable 'controller_profile' from source: play vars 12081 1726882395.02683: variable 'controller_profile' from source: play vars 12081 1726882395.02697: variable 'port2_profile' from source: play vars 12081 1726882395.02787: variable 'port2_profile' from source: play vars 12081 1726882395.02804: variable 'dhcp_interface2' from source: play vars 12081 1726882395.02983: variable 'dhcp_interface2' from source: play vars 12081 1726882395.03008: variable 'controller_profile' from source: play vars 12081 1726882395.03119: variable 'controller_profile' from source: play vars 12081 1726882395.03159: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882395.03243: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882395.03437: variable 'network_connections' from source: include params 12081 1726882395.03448: variable 'controller_profile' from source: play vars 12081 1726882395.03499: variable 'controller_profile' from source: play vars 12081 1726882395.03508: variable 'controller_device' from source: play vars 12081 1726882395.03555: variable 'controller_device' from source: play vars 
12081 1726882395.03565: variable 'port1_profile' from source: play vars 12081 1726882395.03618: variable 'port1_profile' from source: play vars 12081 1726882395.03621: variable 'dhcp_interface1' from source: play vars 12081 1726882395.03668: variable 'dhcp_interface1' from source: play vars 12081 1726882395.03675: variable 'controller_profile' from source: play vars 12081 1726882395.03726: variable 'controller_profile' from source: play vars 12081 1726882395.03729: variable 'port2_profile' from source: play vars 12081 1726882395.03778: variable 'port2_profile' from source: play vars 12081 1726882395.03784: variable 'dhcp_interface2' from source: play vars 12081 1726882395.03835: variable 'dhcp_interface2' from source: play vars 12081 1726882395.03838: variable 'controller_profile' from source: play vars 12081 1726882395.03889: variable 'controller_profile' from source: play vars 12081 1726882395.03907: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882395.03966: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882395.04161: variable 'network_connections' from source: include params 12081 1726882395.04166: variable 'controller_profile' from source: play vars 12081 1726882395.04474: variable 'controller_profile' from source: play vars 12081 1726882395.04478: variable 'controller_device' from source: play vars 12081 1726882395.04480: variable 'controller_device' from source: play vars 12081 1726882395.04482: variable 'port1_profile' from source: play vars 12081 1726882395.04872: variable 'port1_profile' from source: play vars 12081 1726882395.04876: variable 'dhcp_interface1' from source: play vars 12081 1726882395.04879: variable 'dhcp_interface1' from source: play vars 12081 1726882395.04882: variable 'controller_profile' from source: play vars 12081 1726882395.04884: variable 'controller_profile' from source: play vars 12081 1726882395.04886: variable 'port2_profile' from source: play vars 
12081 1726882395.04888: variable 'port2_profile' from source: play vars 12081 1726882395.04890: variable 'dhcp_interface2' from source: play vars 12081 1726882395.04891: variable 'dhcp_interface2' from source: play vars 12081 1726882395.04893: variable 'controller_profile' from source: play vars 12081 1726882395.04895: variable 'controller_profile' from source: play vars 12081 1726882395.04983: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882395.04991: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882395.05091: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882395.05094: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882395.05247: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882395.05705: variable 'network_connections' from source: include params 12081 1726882395.05709: variable 'controller_profile' from source: play vars 12081 1726882395.05767: variable 'controller_profile' from source: play vars 12081 1726882395.05774: variable 'controller_device' from source: play vars 12081 1726882395.05830: variable 'controller_device' from source: play vars 12081 1726882395.05842: variable 'port1_profile' from source: play vars 12081 1726882395.05900: variable 'port1_profile' from source: play vars 12081 1726882395.05906: variable 'dhcp_interface1' from source: play vars 12081 1726882395.05961: variable 'dhcp_interface1' from source: play vars 12081 1726882395.05969: variable 'controller_profile' from source: play vars 12081 1726882395.06027: variable 'controller_profile' from source: play vars 12081 1726882395.06033: variable 'port2_profile' from source: play vars 12081 1726882395.06090: variable 'port2_profile' from source: play vars 12081 1726882395.06096: variable 'dhcp_interface2' from source: play vars 12081 1726882395.06155: variable 
'dhcp_interface2' from source: play vars 12081 1726882395.06159: variable 'controller_profile' from source: play vars 12081 1726882395.06216: variable 'controller_profile' from source: play vars 12081 1726882395.06225: variable 'ansible_distribution' from source: facts 12081 1726882395.06227: variable '__network_rh_distros' from source: role '' defaults 12081 1726882395.06233: variable 'ansible_distribution_major_version' from source: facts 12081 1726882395.06259: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882395.06428: variable 'ansible_distribution' from source: facts 12081 1726882395.06431: variable '__network_rh_distros' from source: role '' defaults 12081 1726882395.06437: variable 'ansible_distribution_major_version' from source: facts 12081 1726882395.06449: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882395.06615: variable 'ansible_distribution' from source: facts 12081 1726882395.06619: variable '__network_rh_distros' from source: role '' defaults 12081 1726882395.06623: variable 'ansible_distribution_major_version' from source: facts 12081 1726882395.06657: variable 'network_provider' from source: set_fact 12081 1726882395.06680: variable 'omit' from source: magic vars 12081 1726882395.06708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882395.06735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882395.06753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882395.06767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882395.06778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 12081 1726882395.06807: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882395.06811: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882395.06813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882395.06910: Set connection var ansible_pipelining to False 12081 1726882395.06913: Set connection var ansible_shell_type to sh 12081 1726882395.06920: Set connection var ansible_shell_executable to /bin/sh 12081 1726882395.06923: Set connection var ansible_connection to ssh 12081 1726882395.06928: Set connection var ansible_timeout to 10 12081 1726882395.06933: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882395.06958: variable 'ansible_shell_executable' from source: unknown 12081 1726882395.06961: variable 'ansible_connection' from source: unknown 12081 1726882395.06965: variable 'ansible_module_compression' from source: unknown 12081 1726882395.06968: variable 'ansible_shell_type' from source: unknown 12081 1726882395.06970: variable 'ansible_shell_executable' from source: unknown 12081 1726882395.06972: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882395.06977: variable 'ansible_pipelining' from source: unknown 12081 1726882395.06979: variable 'ansible_timeout' from source: unknown 12081 1726882395.06983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882395.07092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882395.07102: variable 'omit' from source: magic vars 12081 1726882395.07108: starting attempt loop 12081 1726882395.07111: running the handler 12081 1726882395.07190: variable 
'ansible_facts' from source: unknown 12081 1726882395.07969: _low_level_execute_command(): starting 12081 1726882395.07976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882395.08689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882395.08698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.08708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.08722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.08762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.08770: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882395.08780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.08794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882395.08801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882395.08805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882395.08814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.08822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.08833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.08840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.08846: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882395.08855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882395.08933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882395.08955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882395.08966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.09105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.10824: stdout chunk (state=3): >>>/root <<< 12081 1726882395.10992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882395.10995: stdout chunk (state=3): >>><<< 12081 1726882395.11006: stderr chunk (state=3): >>><<< 12081 1726882395.11026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882395.11037: _low_level_execute_command(): starting 12081 1726882395.11043: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640 `" && echo ansible-tmp-1726882395.1102545-12736-35210861330640="` echo /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640 `" ) && sleep 0' 12081 1726882395.11656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882395.11663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.11676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.11691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.11731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.11737: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882395.11746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.11760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882395.11771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882395.11778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882395.11785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.11794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.11805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.11812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.11818: stderr chunk (state=3): >>>debug2: match found 
<<< 12081 1726882395.11827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.11898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882395.11912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882395.11922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.12059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.13970: stdout chunk (state=3): >>>ansible-tmp-1726882395.1102545-12736-35210861330640=/root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640 <<< 12081 1726882395.14078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882395.14172: stderr chunk (state=3): >>><<< 12081 1726882395.14185: stdout chunk (state=3): >>><<< 12081 1726882395.14473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882395.1102545-12736-35210861330640=/root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882395.14477: variable 'ansible_module_compression' from source: unknown 12081 1726882395.14480: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 12081 1726882395.14483: ANSIBALLZ: Acquiring lock 12081 1726882395.14486: ANSIBALLZ: Lock acquired: 139893497835168 12081 1726882395.14488: ANSIBALLZ: Creating module 12081 1726882395.52327: ANSIBALLZ: Writing module into payload 12081 1726882395.52534: ANSIBALLZ: Writing module 12081 1726882395.52572: ANSIBALLZ: Renaming module 12081 1726882395.52581: ANSIBALLZ: Done creating module 12081 1726882395.52618: variable 'ansible_facts' from source: unknown 12081 1726882395.52816: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/AnsiballZ_systemd.py 12081 1726882395.52967: Sending initial data 12081 1726882395.52970: Sent initial data (155 bytes) 12081 1726882395.54327: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.54331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.54370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882395.54384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.54387: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.54433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882395.54444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.54567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.56444: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882395.56540: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882395.56634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpzeinf1i2 /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/AnsiballZ_systemd.py <<< 12081 1726882395.56734: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882395.58723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882395.58829: stderr chunk (state=3): >>><<< 12081 1726882395.58833: stdout chunk 
(state=3): >>><<< 12081 1726882395.58849: done transferring module to remote 12081 1726882395.58861: _low_level_execute_command(): starting 12081 1726882395.58867: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/ /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/AnsiballZ_systemd.py && sleep 0' 12081 1726882395.59317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.59323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.59355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.59370: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.59381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.59425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882395.59436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.59542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.61301: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 12081 1726882395.61355: stderr chunk (state=3): >>><<< 12081 1726882395.61358: stdout chunk (state=3): >>><<< 12081 1726882395.61373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882395.61376: _low_level_execute_command(): starting 12081 1726882395.61385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/AnsiballZ_systemd.py && sleep 0' 12081 1726882395.61835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.61840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.61875: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882395.61888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.61942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882395.61948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.62075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.87041: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 12081 1726882395.87061: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "13828096", "MemoryAvailable": "infinity", "CPUUsageNSec": "652656000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 12081 1726882395.87076: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target 
cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12081 1726882395.88595: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882395.88599: stdout chunk (state=3): >>><<< 12081 1726882395.88604: stderr chunk (state=3): >>><<< 12081 1726882395.88623: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13828096", "MemoryAvailable": "infinity", "CPUUsageNSec": "652656000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", 
"LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", 
"DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882395.88800: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882395.88820: _low_level_execute_command(): starting 12081 1726882395.88825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882395.1102545-12736-35210861330640/ > /dev/null 2>&1 && sleep 0' 12081 1726882395.89481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882395.89506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.89509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.89511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.89550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.89556: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882395.89566: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.89583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882395.89590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882395.89597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882395.89604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882395.89612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882395.89623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882395.89630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882395.89636: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882395.89645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882395.89728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882395.89735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882395.89738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882395.89881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882395.91767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882395.91771: stderr chunk (state=3): >>><<< 12081 1726882395.91774: stdout chunk (state=3): >>><<< 12081 1726882395.91793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882395.91799: handler run complete 12081 1726882395.91862: attempt loop complete, returning result 12081 1726882395.91868: _execute() done 12081 1726882395.91871: dumping result to json 12081 1726882395.91888: done dumping result, returning 12081 1726882395.91898: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-0a3f-ff3c-000000000283] 12081 1726882395.91903: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000283 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882395.92225: no more pending results, returning what we have 12081 1726882395.92228: results queue empty 12081 1726882395.92229: checking for any_errors_fatal 12081 1726882395.92241: done checking for any_errors_fatal 12081 1726882395.92242: checking for max_fail_percentage 12081 1726882395.92244: done checking for max_fail_percentage 12081 1726882395.92245: checking to see if all hosts have failed and the running result is not ok 
12081 1726882395.92246: done checking to see if all hosts have failed 12081 1726882395.92247: getting the remaining hosts for this loop 12081 1726882395.92249: done getting the remaining hosts for this loop 12081 1726882395.92253: getting the next task for host managed_node3 12081 1726882395.92262: done getting next task for host managed_node3 12081 1726882395.92268: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882395.92274: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882395.92287: getting variables 12081 1726882395.92290: in VariableManager get_vars() 12081 1726882395.92324: Calling all_inventory to load vars for managed_node3 12081 1726882395.92327: Calling groups_inventory to load vars for managed_node3 12081 1726882395.92330: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882395.92342: Calling all_plugins_play to load vars for managed_node3 12081 1726882395.92345: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882395.92348: Calling groups_plugins_play to load vars for managed_node3 12081 1726882395.92970: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000283 12081 1726882395.92978: WORKER PROCESS EXITING 12081 1726882395.94018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882395.97221: done with get_vars() 12081 1726882395.97539: done getting variables 12081 1726882395.97609: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:15 -0400 (0:00:01.028) 0:00:15.779 ****** 12081 1726882395.97648: entering _queue_task() for managed_node3/service 12081 1726882395.98517: worker is 1 (out of 1 available) 12081 1726882395.98530: exiting _queue_task() for managed_node3/service 12081 1726882395.98542: done queuing things up, now waiting for results queue to drain 12081 1726882395.98543: waiting for pending results... 
12081 1726882395.99646: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882395.99919: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000284 12081 1726882395.99929: variable 'ansible_search_path' from source: unknown 12081 1726882395.99932: variable 'ansible_search_path' from source: unknown 12081 1726882396.00193: calling self._execute() 12081 1726882396.00400: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.00404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.00415: variable 'omit' from source: magic vars 12081 1726882396.01123: variable 'ansible_distribution_major_version' from source: facts 12081 1726882396.01135: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882396.01273: variable 'network_provider' from source: set_fact 12081 1726882396.01287: Evaluated conditional (network_provider == "nm"): True 12081 1726882396.01412: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882396.01494: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882396.01734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882396.04030: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882396.04076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882396.04103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882396.04128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882396.04147: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882396.04208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882396.04228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882396.04245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882396.04281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882396.04292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882396.04324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882396.04340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882396.04359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882396.04388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882396.04398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882396.04427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882396.04443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882396.04460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882396.04488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882396.04498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882396.04599: variable 'network_connections' from source: include params 12081 1726882396.04607: variable 'controller_profile' from source: play vars 12081 1726882396.04657: variable 'controller_profile' from source: play vars 12081 1726882396.04667: variable 'controller_device' from source: play vars 12081 1726882396.04723: variable 'controller_device' from source: play vars 12081 1726882396.04747: variable 'port1_profile' from source: play vars 12081 1726882396.04823: variable 'port1_profile' from source: play vars 
12081 1726882396.04830: variable 'dhcp_interface1' from source: play vars 12081 1726882396.04891: variable 'dhcp_interface1' from source: play vars 12081 1726882396.04907: variable 'controller_profile' from source: play vars 12081 1726882396.05239: variable 'controller_profile' from source: play vars 12081 1726882396.05253: variable 'port2_profile' from source: play vars 12081 1726882396.05316: variable 'port2_profile' from source: play vars 12081 1726882396.05332: variable 'dhcp_interface2' from source: play vars 12081 1726882396.05399: variable 'dhcp_interface2' from source: play vars 12081 1726882396.05409: variable 'controller_profile' from source: play vars 12081 1726882396.05496: variable 'controller_profile' from source: play vars 12081 1726882396.05579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882396.05800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882396.05839: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882396.05879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882396.05916: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882396.05968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882396.05995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882396.06027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 12081 1726882396.06078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882396.06166: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882396.06439: variable 'network_connections' from source: include params 12081 1726882396.06451: variable 'controller_profile' from source: play vars 12081 1726882396.06519: variable 'controller_profile' from source: play vars 12081 1726882396.06532: variable 'controller_device' from source: play vars 12081 1726882396.06600: variable 'controller_device' from source: play vars 12081 1726882396.06617: variable 'port1_profile' from source: play vars 12081 1726882396.06686: variable 'port1_profile' from source: play vars 12081 1726882396.06699: variable 'dhcp_interface1' from source: play vars 12081 1726882396.06760: variable 'dhcp_interface1' from source: play vars 12081 1726882396.06776: variable 'controller_profile' from source: play vars 12081 1726882396.06841: variable 'controller_profile' from source: play vars 12081 1726882396.06854: variable 'port2_profile' from source: play vars 12081 1726882396.06923: variable 'port2_profile' from source: play vars 12081 1726882396.06935: variable 'dhcp_interface2' from source: play vars 12081 1726882396.07001: variable 'dhcp_interface2' from source: play vars 12081 1726882396.07015: variable 'controller_profile' from source: play vars 12081 1726882396.07082: variable 'controller_profile' from source: play vars 12081 1726882396.07138: Evaluated conditional (__network_wpa_supplicant_required): False 12081 1726882396.07147: when evaluation is False, skipping this task 12081 1726882396.07157: _execute() done 12081 1726882396.07166: dumping result to json 12081 1726882396.07175: done dumping result, returning 12081 1726882396.07187: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-0a3f-ff3c-000000000284] 12081 1726882396.07196: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000284 12081 1726882396.07315: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000284 12081 1726882396.07322: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12081 1726882396.07378: no more pending results, returning what we have 12081 1726882396.07383: results queue empty 12081 1726882396.07384: checking for any_errors_fatal 12081 1726882396.07404: done checking for any_errors_fatal 12081 1726882396.07405: checking for max_fail_percentage 12081 1726882396.07407: done checking for max_fail_percentage 12081 1726882396.07408: checking to see if all hosts have failed and the running result is not ok 12081 1726882396.07409: done checking to see if all hosts have failed 12081 1726882396.07410: getting the remaining hosts for this loop 12081 1726882396.07412: done getting the remaining hosts for this loop 12081 1726882396.07416: getting the next task for host managed_node3 12081 1726882396.07425: done getting next task for host managed_node3 12081 1726882396.07429: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882396.07435: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882396.07449: getting variables 12081 1726882396.07454: in VariableManager get_vars() 12081 1726882396.07496: Calling all_inventory to load vars for managed_node3 12081 1726882396.07499: Calling groups_inventory to load vars for managed_node3 12081 1726882396.07502: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882396.07515: Calling all_plugins_play to load vars for managed_node3 12081 1726882396.07518: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882396.07521: Calling groups_plugins_play to load vars for managed_node3 12081 1726882396.10021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882396.12462: done with get_vars() 12081 1726882396.12491: done getting variables 12081 1726882396.12554: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:16 -0400 (0:00:00.149) 0:00:15.928 
****** 12081 1726882396.12591: entering _queue_task() for managed_node3/service 12081 1726882396.12923: worker is 1 (out of 1 available) 12081 1726882396.12934: exiting _queue_task() for managed_node3/service 12081 1726882396.12947: done queuing things up, now waiting for results queue to drain 12081 1726882396.12949: waiting for pending results... 12081 1726882396.13316: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882396.13787: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000285 12081 1726882396.13811: variable 'ansible_search_path' from source: unknown 12081 1726882396.13820: variable 'ansible_search_path' from source: unknown 12081 1726882396.13866: calling self._execute() 12081 1726882396.13959: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.13977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.14000: variable 'omit' from source: magic vars 12081 1726882396.14397: variable 'ansible_distribution_major_version' from source: facts 12081 1726882396.14417: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882396.14543: variable 'network_provider' from source: set_fact 12081 1726882396.14557: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882396.14574: when evaluation is False, skipping this task 12081 1726882396.14593: _execute() done 12081 1726882396.14601: dumping result to json 12081 1726882396.14609: done dumping result, returning 12081 1726882396.14620: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-0a3f-ff3c-000000000285] 12081 1726882396.14634: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000285 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
12081 1726882396.14911: no more pending results, returning what we have 12081 1726882396.14915: results queue empty 12081 1726882396.14916: checking for any_errors_fatal 12081 1726882396.14927: done checking for any_errors_fatal 12081 1726882396.14928: checking for max_fail_percentage 12081 1726882396.14930: done checking for max_fail_percentage 12081 1726882396.14931: checking to see if all hosts have failed and the running result is not ok 12081 1726882396.14932: done checking to see if all hosts have failed 12081 1726882396.14932: getting the remaining hosts for this loop 12081 1726882396.14934: done getting the remaining hosts for this loop 12081 1726882396.14938: getting the next task for host managed_node3 12081 1726882396.14946: done getting next task for host managed_node3 12081 1726882396.14953: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882396.14959: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882396.14979: getting variables 12081 1726882396.14982: in VariableManager get_vars() 12081 1726882396.15018: Calling all_inventory to load vars for managed_node3 12081 1726882396.15020: Calling groups_inventory to load vars for managed_node3 12081 1726882396.15023: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882396.15036: Calling all_plugins_play to load vars for managed_node3 12081 1726882396.15039: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882396.15042: Calling groups_plugins_play to load vars for managed_node3 12081 1726882396.16271: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000285 12081 1726882396.16275: WORKER PROCESS EXITING 12081 1726882396.16946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882396.19365: done with get_vars() 12081 1726882396.19502: done getting variables 12081 1726882396.19567: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:16 -0400 (0:00:00.070) 0:00:15.998 ****** 12081 1726882396.19601: entering _queue_task() for managed_node3/copy 12081 1726882396.19915: worker is 1 (out of 1 available) 12081 1726882396.19927: exiting _queue_task() for managed_node3/copy 12081 1726882396.19940: done queuing things up, now waiting for results queue to drain 12081 1726882396.19941: waiting for pending 
results... 12081 1726882396.20924: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882396.21120: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000286 12081 1726882396.21140: variable 'ansible_search_path' from source: unknown 12081 1726882396.21149: variable 'ansible_search_path' from source: unknown 12081 1726882396.21207: calling self._execute() 12081 1726882396.21296: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.21310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.21324: variable 'omit' from source: magic vars 12081 1726882396.21710: variable 'ansible_distribution_major_version' from source: facts 12081 1726882396.21729: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882396.21862: variable 'network_provider' from source: set_fact 12081 1726882396.21876: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882396.21884: when evaluation is False, skipping this task 12081 1726882396.21891: _execute() done 12081 1726882396.21898: dumping result to json 12081 1726882396.21906: done dumping result, returning 12081 1726882396.21919: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-0a3f-ff3c-000000000286] 12081 1726882396.21933: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000286 12081 1726882396.22060: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000286 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12081 1726882396.22114: no more pending results, returning what we have 12081 1726882396.22118: results queue empty 12081 1726882396.22119: checking for any_errors_fatal 12081 
1726882396.22125: done checking for any_errors_fatal 12081 1726882396.22125: checking for max_fail_percentage 12081 1726882396.22127: done checking for max_fail_percentage 12081 1726882396.22128: checking to see if all hosts have failed and the running result is not ok 12081 1726882396.22130: done checking to see if all hosts have failed 12081 1726882396.22131: getting the remaining hosts for this loop 12081 1726882396.22133: done getting the remaining hosts for this loop 12081 1726882396.22137: getting the next task for host managed_node3 12081 1726882396.22146: done getting next task for host managed_node3 12081 1726882396.22150: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882396.22159: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882396.22176: getting variables 12081 1726882396.22179: in VariableManager get_vars() 12081 1726882396.22214: Calling all_inventory to load vars for managed_node3 12081 1726882396.22217: Calling groups_inventory to load vars for managed_node3 12081 1726882396.22220: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882396.22234: Calling all_plugins_play to load vars for managed_node3 12081 1726882396.22237: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882396.22241: Calling groups_plugins_play to load vars for managed_node3 12081 1726882396.23282: WORKER PROCESS EXITING 12081 1726882396.23964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882396.25715: done with get_vars() 12081 1726882396.25740: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:16 -0400 (0:00:00.062) 0:00:16.061 ****** 12081 1726882396.25831: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882396.25832: Creating lock for fedora.linux_system_roles.network_connections 12081 1726882396.26133: worker is 1 (out of 1 available) 12081 1726882396.26145: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882396.26161: done queuing things up, now waiting for results queue to drain 12081 1726882396.26165: waiting for pending results... 
12081 1726882396.26439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882396.26593: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000287 12081 1726882396.26618: variable 'ansible_search_path' from source: unknown 12081 1726882396.26626: variable 'ansible_search_path' from source: unknown 12081 1726882396.26671: calling self._execute() 12081 1726882396.26766: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.26779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.26793: variable 'omit' from source: magic vars 12081 1726882396.27177: variable 'ansible_distribution_major_version' from source: facts 12081 1726882396.27196: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882396.27207: variable 'omit' from source: magic vars 12081 1726882396.27276: variable 'omit' from source: magic vars 12081 1726882396.27438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882396.29836: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882396.29915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882396.29958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882396.30000: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882396.30031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882396.30115: variable 'network_provider' from source: set_fact 12081 1726882396.30254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882396.30294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882396.30324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882396.30372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882396.30394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882396.30474: variable 'omit' from source: magic vars 12081 1726882396.30588: variable 'omit' from source: magic vars 12081 1726882396.30716: variable 'network_connections' from source: include params 12081 1726882396.30736: variable 'controller_profile' from source: play vars 12081 1726882396.30802: variable 'controller_profile' from source: play vars 12081 1726882396.30817: variable 'controller_device' from source: play vars 12081 1726882396.30883: variable 'controller_device' from source: play vars 12081 1726882396.30903: variable 'port1_profile' from source: play vars 12081 1726882396.30970: variable 'port1_profile' from source: play vars 12081 1726882396.30982: variable 'dhcp_interface1' from source: play vars 12081 1726882396.31041: variable 'dhcp_interface1' from source: play vars 12081 1726882396.31060: variable 'controller_profile' from source: play vars 12081 1726882396.31121: variable 'controller_profile' from source: play vars 12081 
1726882396.31135: variable 'port2_profile' from source: play vars 12081 1726882396.31201: variable 'port2_profile' from source: play vars 12081 1726882396.31213: variable 'dhcp_interface2' from source: play vars 12081 1726882396.31281: variable 'dhcp_interface2' from source: play vars 12081 1726882396.31292: variable 'controller_profile' from source: play vars 12081 1726882396.31355: variable 'controller_profile' from source: play vars 12081 1726882396.31589: variable 'omit' from source: magic vars 12081 1726882396.31607: variable '__lsr_ansible_managed' from source: task vars 12081 1726882396.31674: variable '__lsr_ansible_managed' from source: task vars 12081 1726882396.31870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12081 1726882396.32414: Loaded config def from plugin (lookup/template) 12081 1726882396.32425: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12081 1726882396.32459: File lookup term: get_ansible_managed.j2 12081 1726882396.32473: variable 'ansible_search_path' from source: unknown 12081 1726882396.32483: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12081 1726882396.32501: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12081 1726882396.32524: variable 'ansible_search_path' from source: unknown 12081 1726882396.41359: variable 'ansible_managed' from source: unknown 12081 1726882396.41528: variable 'omit' from source: magic vars 12081 1726882396.41568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882396.41598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882396.41619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882396.41645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882396.41665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882396.41698: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882396.41708: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.41716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.41822: Set connection var ansible_pipelining to False 12081 1726882396.41830: Set connection var ansible_shell_type to sh 12081 1726882396.41842: Set connection var ansible_shell_executable to /bin/sh 12081 1726882396.41849: Set connection var ansible_connection to ssh 12081 1726882396.41869: Set connection var 
ansible_timeout to 10 12081 1726882396.41888: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882396.41918: variable 'ansible_shell_executable' from source: unknown 12081 1726882396.41927: variable 'ansible_connection' from source: unknown 12081 1726882396.41934: variable 'ansible_module_compression' from source: unknown 12081 1726882396.41941: variable 'ansible_shell_type' from source: unknown 12081 1726882396.41948: variable 'ansible_shell_executable' from source: unknown 12081 1726882396.41957: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882396.41970: variable 'ansible_pipelining' from source: unknown 12081 1726882396.41977: variable 'ansible_timeout' from source: unknown 12081 1726882396.41985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882396.42126: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882396.42142: variable 'omit' from source: magic vars 12081 1726882396.42158: starting attempt loop 12081 1726882396.42167: running the handler 12081 1726882396.42187: _low_level_execute_command(): starting 12081 1726882396.42201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882396.42976: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882396.42993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.43008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.43025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.43075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 12081 1726882396.43089: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882396.43104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.43122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882396.43134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882396.43146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882396.43163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.43184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.43201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.43214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.43226: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882396.43240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.43323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882396.43348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882396.43370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882396.43513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882396.45348: stdout chunk (state=3): >>>/root <<< 12081 1726882396.45455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882396.45556: stderr chunk (state=3): >>><<< 12081 1726882396.45572: stdout chunk (state=3): >>><<< 12081 1726882396.45671: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882396.45675: _low_level_execute_command(): starting 12081 1726882396.45678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362 `" && echo ansible-tmp-1726882396.456058-12810-9204074884362="` echo /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362 `" ) && sleep 0' 12081 1726882396.46337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882396.46353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.46372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.46392: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.46437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.46449: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882396.46468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.46489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882396.46501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882396.46510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882396.46522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.46534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.46556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.46569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.46583: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882396.46600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.46684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882396.46709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882396.46724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882396.46856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882396.48811: stdout chunk (state=3): >>>ansible-tmp-1726882396.456058-12810-9204074884362=/root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362 <<< 12081 1726882396.48923: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882396.49018: stderr chunk (state=3): >>><<< 12081 1726882396.49028: stdout chunk (state=3): >>><<< 12081 1726882396.49173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882396.456058-12810-9204074884362=/root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882396.49176: variable 'ansible_module_compression' from source: unknown 12081 1726882396.49283: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 12081 1726882396.49286: ANSIBALLZ: Acquiring lock 12081 1726882396.49289: ANSIBALLZ: Lock acquired: 139893498592144 12081 1726882396.49291: ANSIBALLZ: Creating module 12081 1726882396.73807: ANSIBALLZ: Writing module into payload 12081 1726882396.74349: ANSIBALLZ: Writing module 12081 1726882396.74390: 
ANSIBALLZ: Renaming module 12081 1726882396.74402: ANSIBALLZ: Done creating module 12081 1726882396.74437: variable 'ansible_facts' from source: unknown 12081 1726882396.74547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/AnsiballZ_network_connections.py 12081 1726882396.74720: Sending initial data 12081 1726882396.74723: Sent initial data (165 bytes) 12081 1726882396.75798: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882396.75813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.75827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.75857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.75902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.75915: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882396.75929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.75953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882396.75973: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882396.75987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882396.76001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.76016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.76032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.76045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 
1726882396.76062: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882396.76080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.76155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882396.76180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882396.76198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882396.76329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882396.78358: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882396.78448: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882396.78554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmplv4raxy1 /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/AnsiballZ_network_connections.py <<< 12081 1726882396.78648: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882396.80467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882396.80787: stderr chunk (state=3): >>><<< 12081 1726882396.80791: stdout chunk (state=3): >>><<< 12081 1726882396.80793: done 
transferring module to remote 12081 1726882396.80795: _low_level_execute_command(): starting 12081 1726882396.80797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/ /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/AnsiballZ_network_connections.py && sleep 0' 12081 1726882396.81454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882396.81480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.81495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.81513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.81557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.81577: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882396.81592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.81609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882396.81620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882396.81630: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882396.81641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.81654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.81673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.81694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.81704: stderr chunk (state=3): >>>debug2: 
match found <<< 12081 1726882396.81717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.81802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882396.81827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882396.81848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882396.81986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882396.83822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882396.83924: stderr chunk (state=3): >>><<< 12081 1726882396.83934: stdout chunk (state=3): >>><<< 12081 1726882396.84043: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882396.84047: 
_low_level_execute_command(): starting 12081 1726882396.84049: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/AnsiballZ_network_connections.py && sleep 0' 12081 1726882396.84665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882396.84680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.84703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.84720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.84762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.84776: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882396.84789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.84814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882396.84825: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882396.84835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882396.84845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882396.84857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882396.84874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882396.84884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882396.84894: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882396.84910: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882396.84992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882396.85016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882396.85038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882396.85181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.23141: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": 
"up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12081 1726882397.24872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882397.24928: stderr chunk (state=3): >>><<< 12081 1726882397.24931: stdout chunk (state=3): >>><<< 12081 1726882397.24948: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", 
"type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.9.105 closed. 12081 1726882397.25009: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882397.25018: _low_level_execute_command(): starting 12081 1726882397.25022: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882396.456058-12810-9204074884362/ > /dev/null 2>&1 && sleep 0' 12081 1726882397.25467: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882397.25475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.25484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.25494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.25541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.25544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.25546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.25604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.25608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.25614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.25712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.27811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.27855: stderr chunk (state=3): >>><<< 12081 1726882397.27858: stdout chunk (state=3): >>><<< 12081 1726882397.27872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882397.27878: handler run complete 12081 1726882397.27912: attempt loop complete, returning result 12081 1726882397.27916: _execute() done 12081 1726882397.27919: dumping result to json 12081 1726882397.27930: done dumping result, returning 12081 1726882397.27938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-0a3f-ff3c-000000000287] 12081 1726882397.27943: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000287 12081 1726882397.28069: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000287 12081 1726882397.28071: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, 
"all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active) 12081 1726882397.28229: no more pending results, returning what we have 12081 1726882397.28232: results queue empty 12081 1726882397.28233: checking for any_errors_fatal 12081 1726882397.28240: done checking for any_errors_fatal 12081 1726882397.28241: checking for max_fail_percentage 12081 1726882397.28243: done checking for max_fail_percentage 12081 1726882397.28243: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.28244: done checking to see if all hosts have failed 12081 1726882397.28245: getting the remaining 
hosts for this loop 12081 1726882397.28246: done getting the remaining hosts for this loop 12081 1726882397.28250: getting the next task for host managed_node3 12081 1726882397.28258: done getting next task for host managed_node3 12081 1726882397.28262: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882397.28273: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.28285: getting variables 12081 1726882397.28286: in VariableManager get_vars() 12081 1726882397.28317: Calling all_inventory to load vars for managed_node3 12081 1726882397.28320: Calling groups_inventory to load vars for managed_node3 12081 1726882397.28322: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.28331: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.28334: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.28336: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.29271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.30199: done with get_vars() 12081 1726882397.30215: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:17 -0400 (0:00:01.044) 0:00:17.105 ****** 12081 1726882397.30281: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882397.30282: Creating lock for fedora.linux_system_roles.network_state 12081 1726882397.30499: worker is 1 (out of 1 available) 12081 1726882397.30511: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882397.30523: done queuing things up, now waiting for results queue to drain 12081 1726882397.30525: waiting for pending results... 
12081 1726882397.30698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882397.30793: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000288 12081 1726882397.30805: variable 'ansible_search_path' from source: unknown 12081 1726882397.30809: variable 'ansible_search_path' from source: unknown 12081 1726882397.30837: calling self._execute() 12081 1726882397.30903: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.30907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.30915: variable 'omit' from source: magic vars 12081 1726882397.31188: variable 'ansible_distribution_major_version' from source: facts 12081 1726882397.31207: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882397.31291: variable 'network_state' from source: role '' defaults 12081 1726882397.31303: Evaluated conditional (network_state != {}): False 12081 1726882397.31307: when evaluation is False, skipping this task 12081 1726882397.31310: _execute() done 12081 1726882397.31312: dumping result to json 12081 1726882397.31314: done dumping result, returning 12081 1726882397.31323: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-0a3f-ff3c-000000000288] 12081 1726882397.31330: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000288 12081 1726882397.31416: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000288 12081 1726882397.31418: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882397.31473: no more pending results, returning what we have 12081 1726882397.31477: results queue empty 12081 1726882397.31478: checking for any_errors_fatal 12081 1726882397.31491: done checking for any_errors_fatal 
12081 1726882397.31492: checking for max_fail_percentage 12081 1726882397.31493: done checking for max_fail_percentage 12081 1726882397.31494: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.31495: done checking to see if all hosts have failed 12081 1726882397.31496: getting the remaining hosts for this loop 12081 1726882397.31497: done getting the remaining hosts for this loop 12081 1726882397.31501: getting the next task for host managed_node3 12081 1726882397.31507: done getting next task for host managed_node3 12081 1726882397.31510: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882397.31515: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.31532: getting variables 12081 1726882397.31534: in VariableManager get_vars() 12081 1726882397.31562: Calling all_inventory to load vars for managed_node3 12081 1726882397.31567: Calling groups_inventory to load vars for managed_node3 12081 1726882397.31570: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.31578: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.31580: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.31582: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.32342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.33277: done with get_vars() 12081 1726882397.33298: done getting variables 12081 1726882397.33340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:17 -0400 (0:00:00.030) 0:00:17.136 ****** 12081 1726882397.33368: entering _queue_task() for managed_node3/debug 12081 1726882397.33601: worker is 1 (out of 1 available) 12081 1726882397.33617: exiting _queue_task() for managed_node3/debug 12081 1726882397.33632: done queuing things up, now waiting for results queue to drain 12081 1726882397.33633: waiting for pending results... 
12081 1726882397.33819: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882397.33917: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000289 12081 1726882397.33929: variable 'ansible_search_path' from source: unknown 12081 1726882397.33933: variable 'ansible_search_path' from source: unknown 12081 1726882397.33966: calling self._execute() 12081 1726882397.34042: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.34047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.34055: variable 'omit' from source: magic vars 12081 1726882397.34330: variable 'ansible_distribution_major_version' from source: facts 12081 1726882397.34341: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882397.34349: variable 'omit' from source: magic vars 12081 1726882397.34395: variable 'omit' from source: magic vars 12081 1726882397.34418: variable 'omit' from source: magic vars 12081 1726882397.34455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882397.34482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882397.34497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882397.34510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.34522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.34545: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882397.34548: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.34550: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882397.34625: Set connection var ansible_pipelining to False 12081 1726882397.34628: Set connection var ansible_shell_type to sh 12081 1726882397.34631: Set connection var ansible_shell_executable to /bin/sh 12081 1726882397.34633: Set connection var ansible_connection to ssh 12081 1726882397.34640: Set connection var ansible_timeout to 10 12081 1726882397.34645: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882397.34667: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.34670: variable 'ansible_connection' from source: unknown 12081 1726882397.34673: variable 'ansible_module_compression' from source: unknown 12081 1726882397.34677: variable 'ansible_shell_type' from source: unknown 12081 1726882397.34679: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.34681: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.34683: variable 'ansible_pipelining' from source: unknown 12081 1726882397.34687: variable 'ansible_timeout' from source: unknown 12081 1726882397.34689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.34790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882397.34799: variable 'omit' from source: magic vars 12081 1726882397.34802: starting attempt loop 12081 1726882397.34805: running the handler 12081 1726882397.34903: variable '__network_connections_result' from source: set_fact 12081 1726882397.34958: handler run complete 12081 1726882397.34975: attempt loop complete, returning result 12081 1726882397.34979: _execute() done 12081 1726882397.34981: dumping result to json 12081 1726882397.34983: 
done dumping result, returning 12081 1726882397.34990: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-000000000289] 12081 1726882397.34997: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000289 12081 1726882397.35080: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000289 12081 1726882397.35082: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)" ] } 12081 1726882397.35148: no more pending results, returning what we have 12081 1726882397.35153: results queue empty 12081 1726882397.35154: checking for any_errors_fatal 12081 1726882397.35165: done checking for any_errors_fatal 12081 1726882397.35166: checking for max_fail_percentage 12081 1726882397.35168: done checking for max_fail_percentage 12081 1726882397.35169: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.35170: done checking to see if all hosts have failed 12081 1726882397.35170: getting the remaining hosts for this loop 12081 1726882397.35172: done getting the remaining hosts for this loop 12081 1726882397.35176: getting the next task for host 
managed_node3 12081 1726882397.35183: done getting next task for host managed_node3 12081 1726882397.35186: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882397.35197: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.35208: getting variables 12081 1726882397.35210: in VariableManager get_vars() 12081 1726882397.35238: Calling all_inventory to load vars for managed_node3 12081 1726882397.35241: Calling groups_inventory to load vars for managed_node3 12081 1726882397.35243: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.35254: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.35261: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.35267: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.36178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.37109: done with get_vars() 12081 1726882397.37127: done getting variables 12081 1726882397.37178: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:17 -0400 (0:00:00.038) 0:00:17.174 ****** 12081 1726882397.37201: entering _queue_task() for managed_node3/debug 12081 1726882397.37434: worker is 1 (out of 1 available) 12081 1726882397.37448: exiting _queue_task() for managed_node3/debug 12081 1726882397.37466: done queuing things up, now waiting for results queue to drain 12081 1726882397.37468: waiting for pending results... 
12081 1726882397.37644: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882397.37730: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000028a 12081 1726882397.37743: variable 'ansible_search_path' from source: unknown 12081 1726882397.37746: variable 'ansible_search_path' from source: unknown 12081 1726882397.37781: calling self._execute() 12081 1726882397.37855: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.37859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.37869: variable 'omit' from source: magic vars 12081 1726882397.38153: variable 'ansible_distribution_major_version' from source: facts 12081 1726882397.38164: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882397.38170: variable 'omit' from source: magic vars 12081 1726882397.38210: variable 'omit' from source: magic vars 12081 1726882397.38237: variable 'omit' from source: magic vars 12081 1726882397.38271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882397.38296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882397.38312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882397.38325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.38335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.38361: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882397.38367: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.38369: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882397.38436: Set connection var ansible_pipelining to False 12081 1726882397.38439: Set connection var ansible_shell_type to sh 12081 1726882397.38445: Set connection var ansible_shell_executable to /bin/sh 12081 1726882397.38448: Set connection var ansible_connection to ssh 12081 1726882397.38452: Set connection var ansible_timeout to 10 12081 1726882397.38463: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882397.38486: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.38490: variable 'ansible_connection' from source: unknown 12081 1726882397.38493: variable 'ansible_module_compression' from source: unknown 12081 1726882397.38495: variable 'ansible_shell_type' from source: unknown 12081 1726882397.38497: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.38499: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.38501: variable 'ansible_pipelining' from source: unknown 12081 1726882397.38503: variable 'ansible_timeout' from source: unknown 12081 1726882397.38508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.38609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882397.38617: variable 'omit' from source: magic vars 12081 1726882397.38622: starting attempt loop 12081 1726882397.38625: running the handler 12081 1726882397.38670: variable '__network_connections_result' from source: set_fact 12081 1726882397.38725: variable '__network_connections_result' from source: set_fact 12081 1726882397.38866: handler run complete 12081 1726882397.38892: attempt loop complete, returning result 12081 1726882397.38896: 
_execute() done 12081 1726882397.38899: dumping result to json 12081 1726882397.38901: done dumping result, returning 12081 1726882397.38911: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-00000000028a] 12081 1726882397.38917: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028a 12081 1726882397.39029: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028a 12081 1726882397.39032: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408\n[010] #0, state:up persistent_state:present, 'bond0': 
up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)" ] } } 12081 1726882397.39159: no more pending results, returning what we have 12081 1726882397.39163: results queue empty 12081 1726882397.39166: checking for any_errors_fatal 12081 1726882397.39170: done checking for any_errors_fatal 12081 1726882397.39171: checking for max_fail_percentage 12081 1726882397.39172: done checking for max_fail_percentage 12081 1726882397.39173: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.39174: done checking to see if all hosts have failed 12081 1726882397.39174: getting the remaining hosts for this loop 12081 1726882397.39176: done getting the remaining hosts for this loop 12081 1726882397.39179: getting the next task for host managed_node3 12081 1726882397.39185: done getting next task for host managed_node3 12081 1726882397.39188: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages 
for the network_state 12081 1726882397.39193: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.39202: getting variables 12081 1726882397.39203: in VariableManager get_vars() 12081 1726882397.39232: Calling all_inventory to load vars for managed_node3 12081 1726882397.39234: Calling groups_inventory to load vars for managed_node3 12081 1726882397.39236: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.39242: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.39244: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.39246: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.40039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.41449: done with get_vars() 12081 1726882397.41478: done getting variables 12081 1726882397.41539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:17 -0400 (0:00:00.043) 0:00:17.218 ****** 12081 1726882397.41580: entering _queue_task() for managed_node3/debug 12081 1726882397.41904: worker is 1 (out of 1 available) 12081 1726882397.41930: exiting _queue_task() for managed_node3/debug 12081 1726882397.41943: done queuing things up, now waiting for results queue to drain 12081 1726882397.41945: waiting for pending results... 
12081 1726882397.42426: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882397.42581: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000028b 12081 1726882397.42649: variable 'ansible_search_path' from source: unknown 12081 1726882397.42663: variable 'ansible_search_path' from source: unknown 12081 1726882397.42708: calling self._execute() 12081 1726882397.42805: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.42816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.42833: variable 'omit' from source: magic vars 12081 1726882397.43212: variable 'ansible_distribution_major_version' from source: facts 12081 1726882397.43231: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882397.43367: variable 'network_state' from source: role '' defaults 12081 1726882397.43388: Evaluated conditional (network_state != {}): False 12081 1726882397.43396: when evaluation is False, skipping this task 12081 1726882397.43403: _execute() done 12081 1726882397.43410: dumping result to json 12081 1726882397.43417: done dumping result, returning 12081 1726882397.43428: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-0a3f-ff3c-00000000028b] 12081 1726882397.43441: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028b skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12081 1726882397.43598: no more pending results, returning what we have 12081 1726882397.43603: results queue empty 12081 1726882397.43603: checking for any_errors_fatal 12081 1726882397.43618: done checking for any_errors_fatal 12081 1726882397.43619: checking for max_fail_percentage 12081 1726882397.43621: done checking for max_fail_percentage 12081 1726882397.43622: checking to see if all hosts have 
failed and the running result is not ok 12081 1726882397.43623: done checking to see if all hosts have failed 12081 1726882397.43624: getting the remaining hosts for this loop 12081 1726882397.43626: done getting the remaining hosts for this loop 12081 1726882397.43630: getting the next task for host managed_node3 12081 1726882397.43640: done getting next task for host managed_node3 12081 1726882397.43644: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882397.43654: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.43672: getting variables 12081 1726882397.43674: in VariableManager get_vars() 12081 1726882397.43712: Calling all_inventory to load vars for managed_node3 12081 1726882397.43715: Calling groups_inventory to load vars for managed_node3 12081 1726882397.43718: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.43730: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.43733: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.43736: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.44684: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028b 12081 1726882397.44688: WORKER PROCESS EXITING 12081 1726882397.45371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.46313: done with get_vars() 12081 1726882397.46332: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:17 -0400 (0:00:00.048) 0:00:17.267 ****** 12081 1726882397.46410: entering _queue_task() for managed_node3/ping 12081 1726882397.46411: Creating lock for ping 12081 1726882397.46654: worker is 1 (out of 1 available) 12081 1726882397.46668: exiting _queue_task() for managed_node3/ping 12081 1726882397.46682: done queuing things up, now waiting for results queue to drain 12081 1726882397.46683: waiting for pending results... 
12081 1726882397.46879: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882397.47002: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000028c 12081 1726882397.47014: variable 'ansible_search_path' from source: unknown 12081 1726882397.47017: variable 'ansible_search_path' from source: unknown 12081 1726882397.47050: calling self._execute() 12081 1726882397.47138: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.47143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.47154: variable 'omit' from source: magic vars 12081 1726882397.47467: variable 'ansible_distribution_major_version' from source: facts 12081 1726882397.47490: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882397.47494: variable 'omit' from source: magic vars 12081 1726882397.47541: variable 'omit' from source: magic vars 12081 1726882397.47674: variable 'omit' from source: magic vars 12081 1726882397.47680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882397.47684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882397.47687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882397.47690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.47707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882397.47738: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882397.47746: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.47760: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12081 1726882397.47877: Set connection var ansible_pipelining to False 12081 1726882397.47886: Set connection var ansible_shell_type to sh 12081 1726882397.47898: Set connection var ansible_shell_executable to /bin/sh 12081 1726882397.47904: Set connection var ansible_connection to ssh 12081 1726882397.47919: Set connection var ansible_timeout to 10 12081 1726882397.47927: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882397.47959: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.47974: variable 'ansible_connection' from source: unknown 12081 1726882397.47982: variable 'ansible_module_compression' from source: unknown 12081 1726882397.47989: variable 'ansible_shell_type' from source: unknown 12081 1726882397.47996: variable 'ansible_shell_executable' from source: unknown 12081 1726882397.48003: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882397.48010: variable 'ansible_pipelining' from source: unknown 12081 1726882397.48017: variable 'ansible_timeout' from source: unknown 12081 1726882397.48027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882397.48245: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882397.48266: variable 'omit' from source: magic vars 12081 1726882397.48276: starting attempt loop 12081 1726882397.48282: running the handler 12081 1726882397.48304: _low_level_execute_command(): starting 12081 1726882397.48315: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882397.48950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882397.48974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.48989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.49005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.49043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.49061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.49167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.50795: stdout chunk (state=3): >>>/root <<< 12081 1726882397.50904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.50983: stderr chunk (state=3): >>><<< 12081 1726882397.50991: stdout chunk (state=3): >>><<< 12081 1726882397.51025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882397.51045: _low_level_execute_command(): starting 12081 1726882397.51057: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436 `" && echo ansible-tmp-1726882397.5103035-12850-9394982271436="` echo /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436 `" ) && sleep 0' 12081 1726882397.51726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882397.51741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.51758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.51787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.51830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.51844: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882397.51872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
12081 1726882397.51898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882397.51914: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882397.51926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882397.51940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.51956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.51975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.51992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.52008: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882397.52028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.52115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.52143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.52163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.52297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.54205: stdout chunk (state=3): >>>ansible-tmp-1726882397.5103035-12850-9394982271436=/root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436 <<< 12081 1726882397.54401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.54405: stdout chunk (state=3): >>><<< 12081 1726882397.54407: stderr chunk (state=3): >>><<< 12081 1726882397.54654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882397.5103035-12850-9394982271436=/root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882397.54661: variable 'ansible_module_compression' from source: unknown 12081 1726882397.54665: ANSIBALLZ: Using lock for ping 12081 1726882397.54668: ANSIBALLZ: Acquiring lock 12081 1726882397.54670: ANSIBALLZ: Lock acquired: 139893496635632 12081 1726882397.54672: ANSIBALLZ: Creating module 12081 1726882397.67586: ANSIBALLZ: Writing module into payload 12081 1726882397.67666: ANSIBALLZ: Writing module 12081 1726882397.67706: ANSIBALLZ: Renaming module 12081 1726882397.67719: ANSIBALLZ: Done creating module 12081 1726882397.67741: variable 'ansible_facts' from source: unknown 12081 1726882397.67822: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/AnsiballZ_ping.py 12081 1726882397.67993: Sending initial data 12081 1726882397.67996: Sent initial data (151 bytes) 12081 1726882397.69078: 
stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882397.69098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.69120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.69140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.69189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.69207: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882397.69222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.69245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882397.69261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882397.69277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882397.69290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.69308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.69325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.69348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.69367: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882397.69404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.69488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.69509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.69528: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.69675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.71492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882397.71588: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882397.71695: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmptirhd43f /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/AnsiballZ_ping.py <<< 12081 1726882397.71803: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882397.72975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.73146: stderr chunk (state=3): >>><<< 12081 1726882397.73149: stdout chunk (state=3): >>><<< 12081 1726882397.73154: done transferring module to remote 12081 1726882397.73156: _low_level_execute_command(): starting 12081 1726882397.73158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/ /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/AnsiballZ_ping.py && sleep 0' 12081 1726882397.73778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882397.73802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.73819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.73842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.73893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.73913: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882397.73932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.73950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882397.73968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882397.74377: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882397.74890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.74904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.74918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.74929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.74939: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882397.74956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.75040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.75238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.75256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.75395: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.77253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.77258: stdout chunk (state=3): >>><<< 12081 1726882397.77260: stderr chunk (state=3): >>><<< 12081 1726882397.77286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882397.77289: _low_level_execute_command(): starting 12081 1726882397.77293: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/AnsiballZ_ping.py && sleep 0' 12081 1726882397.77911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882397.77921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.77940: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.77957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.77998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.78006: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882397.78018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.78032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882397.78048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882397.78056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882397.78063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.78078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.78091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.78100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.78107: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882397.78118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.78197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.78216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.78228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.78443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.91150: stdout chunk 
(state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12081 1726882397.92203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882397.92207: stdout chunk (state=3): >>><<< 12081 1726882397.92209: stderr chunk (state=3): >>><<< 12081 1726882397.92272: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882397.92278: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882397.92285: _low_level_execute_command(): starting 12081 1726882397.92287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882397.5103035-12850-9394982271436/ > /dev/null 2>&1 && sleep 0' 12081 1726882397.93028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882397.93046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.93080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.93086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.93126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882397.93130: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882397.93142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.93154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882397.93157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882397.93168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882397.93177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882397.93183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882397.93190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882397.93276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882397.93279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882397.93303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882397.93427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882397.95283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882397.95287: stderr chunk (state=3): >>><<< 12081 1726882397.95290: stdout chunk (state=3): >>><<< 12081 1726882397.95305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882397.95311: handler run complete 12081 1726882397.95323: attempt loop complete, returning result 12081 1726882397.95326: _execute() done 12081 1726882397.95328: dumping result to json 12081 1726882397.95330: done dumping result, returning 12081 1726882397.95339: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-0a3f-ff3c-00000000028c] 12081 1726882397.95345: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028c 12081 1726882397.95436: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000028c 12081 1726882397.95439: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12081 1726882397.95498: no more pending results, returning what we have 12081 1726882397.95501: results queue empty 12081 1726882397.95502: checking for any_errors_fatal 12081 1726882397.95508: done checking for any_errors_fatal 12081 1726882397.95509: checking for max_fail_percentage 12081 1726882397.95510: done checking for max_fail_percentage 12081 1726882397.95511: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.95512: done checking to see if all hosts have failed 12081 1726882397.95513: getting the remaining hosts for this loop 12081 1726882397.95515: done getting the remaining hosts for this loop 12081 1726882397.95518: getting the next task for host managed_node3 12081 1726882397.95528: done getting next task for host managed_node3 12081 1726882397.95530: ^ task is: TASK: meta 
(role_complete) 12081 1726882397.95535: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882397.95544: getting variables 12081 1726882397.95546: in VariableManager get_vars() 12081 1726882397.95588: Calling all_inventory to load vars for managed_node3 12081 1726882397.95591: Calling groups_inventory to load vars for managed_node3 12081 1726882397.95593: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.95604: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.95606: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.95609: Calling groups_plugins_play to load vars for managed_node3 12081 1726882397.96701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882397.99331: done with get_vars() 12081 1726882397.99383: done getting variables 12081 1726882397.99488: done queuing things up, now waiting for results queue to drain 12081 1726882397.99490: results queue empty 12081 1726882397.99491: checking for any_errors_fatal 12081 1726882397.99495: done checking for any_errors_fatal 12081 1726882397.99495: checking for max_fail_percentage 12081 1726882397.99496: done checking for max_fail_percentage 12081 1726882397.99497: checking to see if all hosts have failed and the running result is not ok 12081 1726882397.99498: done checking to see if all hosts have failed 12081 1726882397.99499: getting the remaining hosts for this loop 12081 1726882397.99499: done getting the remaining hosts for this loop 12081 1726882397.99503: getting the next task for host managed_node3 12081 1726882397.99507: done getting next task for host managed_node3 12081 1726882397.99509: ^ task is: TASK: Show result 12081 1726882397.99516: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882397.99519: getting variables 12081 1726882397.99520: in VariableManager get_vars() 12081 1726882397.99530: Calling all_inventory to load vars for managed_node3 12081 1726882397.99532: Calling groups_inventory to load vars for managed_node3 12081 1726882397.99535: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882397.99540: Calling all_plugins_play to load vars for managed_node3 12081 1726882397.99542: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882397.99545: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.00539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.01484: done with get_vars() 12081 1726882398.01503: done getting variables 12081 1726882398.01537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Friday 20 September 2024 21:33:18 -0400 (0:00:00.551) 0:00:17.818 ****** 12081 1726882398.01561: entering _queue_task() for managed_node3/debug 12081 1726882398.01937: worker is 1 (out of 1 available) 12081 1726882398.01948: exiting _queue_task() for managed_node3/debug 12081 1726882398.01966: done queuing things up, now waiting for results queue to drain 12081 1726882398.01968: waiting for pending results... 12081 1726882398.02114: running TaskExecutor() for managed_node3/TASK: Show result 12081 1726882398.02253: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000001c6 12081 1726882398.02279: variable 'ansible_search_path' from source: unknown 12081 1726882398.02293: variable 'ansible_search_path' from source: unknown 12081 1726882398.02342: calling self._execute() 12081 1726882398.02444: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.02457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.02473: variable 'omit' from source: magic vars 12081 1726882398.02888: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.02899: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.02905: variable 'omit' from source: magic vars 12081 1726882398.02936: variable 'omit' from source: magic vars 12081 1726882398.02984: variable 'omit' from source: magic vars 12081 1726882398.03018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882398.03042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882398.03069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882398.03084: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.03094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.03120: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882398.03129: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.03131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.03204: Set connection var ansible_pipelining to False 12081 1726882398.03207: Set connection var ansible_shell_type to sh 12081 1726882398.03212: Set connection var ansible_shell_executable to /bin/sh 12081 1726882398.03215: Set connection var ansible_connection to ssh 12081 1726882398.03220: Set connection var ansible_timeout to 10 12081 1726882398.03225: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882398.03246: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.03248: variable 'ansible_connection' from source: unknown 12081 1726882398.03254: variable 'ansible_module_compression' from source: unknown 12081 1726882398.03256: variable 'ansible_shell_type' from source: unknown 12081 1726882398.03258: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.03260: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.03264: variable 'ansible_pipelining' from source: unknown 12081 1726882398.03266: variable 'ansible_timeout' from source: unknown 12081 1726882398.03268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.03370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882398.03381: variable 'omit' from source: magic vars 12081 1726882398.03384: starting attempt loop 12081 1726882398.03387: running the handler 12081 1726882398.03427: variable '__network_connections_result' from source: set_fact 12081 1726882398.03484: variable '__network_connections_result' from source: set_fact 12081 1726882398.03623: handler run complete 12081 1726882398.03648: attempt loop complete, returning result 12081 1726882398.03653: _execute() done 12081 1726882398.03656: dumping result to json 12081 1726882398.03659: done dumping result, returning 12081 1726882398.03668: done running TaskExecutor() for managed_node3/TASK: Show result [0e448fcc-3ce9-0a3f-ff3c-0000000001c6] 12081 1726882398.03675: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000001c6 12081 1726882398.03776: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000001c6 12081 1726882398.03779: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": 
"bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 26d4f563-848c-497d-a265-cdc5607d566a (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, ac020b9c-b6bb-4116-8ba6-0fb911e93ae5 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f1f3e927-6a4a-4c04-ae47-d49ad5d71408 (not-active)" ] } } 12081 1726882398.03891: no more pending results, returning what we have 12081 1726882398.03894: results queue empty 12081 1726882398.03895: checking for any_errors_fatal 12081 1726882398.03897: done checking for any_errors_fatal 12081 1726882398.03897: checking for max_fail_percentage 12081 1726882398.03899: done 
checking for max_fail_percentage 12081 1726882398.03900: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.03901: done checking to see if all hosts have failed 12081 1726882398.03901: getting the remaining hosts for this loop 12081 1726882398.03903: done getting the remaining hosts for this loop 12081 1726882398.03906: getting the next task for host managed_node3 12081 1726882398.03914: done getting next task for host managed_node3 12081 1726882398.03917: ^ task is: TASK: Asserts 12081 1726882398.03920: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882398.03924: getting variables 12081 1726882398.03926: in VariableManager get_vars() 12081 1726882398.03953: Calling all_inventory to load vars for managed_node3 12081 1726882398.03956: Calling groups_inventory to load vars for managed_node3 12081 1726882398.03958: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.03969: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.03972: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.03976: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.04890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.06589: done with get_vars() 12081 1726882398.06620: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:33:18 -0400 (0:00:00.051) 0:00:17.870 ****** 12081 1726882398.06733: entering _queue_task() for managed_node3/include_tasks 12081 1726882398.07095: worker is 1 (out of 1 available) 12081 1726882398.07110: exiting _queue_task() for managed_node3/include_tasks 12081 1726882398.07123: done queuing things up, now waiting for results queue to drain 12081 1726882398.07124: waiting for pending results... 
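Annotation: the `Show result` output above dumps the full `module_args` that `fedora.linux_system_roles.network` passed to its backend. Reconstructed as role input, the connection spec implied by that dump is (every value below is taken directly from the logged `connections` list; indentation and key order are mine):

```yaml
# Reconstruction of the network_connections variable implied by the
# logged module_args: an 802.3ad bond "bond0" (device nm-bond) with
# two ethernet port profiles attached to it.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: 802.3ad
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: "encap2+3"
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

The `stderr_lines` in the result correspond one-to-one with these profiles: lines [007]-[009] add the three connections, [010]-[012] bring them up (`provider: nm`, so these are NetworkManager profiles).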
12081 1726882398.07420: running TaskExecutor() for managed_node3/TASK: Asserts 12081 1726882398.07554: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008d 12081 1726882398.07580: variable 'ansible_search_path' from source: unknown 12081 1726882398.07587: variable 'ansible_search_path' from source: unknown 12081 1726882398.07637: variable 'lsr_assert' from source: include params 12081 1726882398.07854: variable 'lsr_assert' from source: include params 12081 1726882398.07939: variable 'omit' from source: magic vars 12081 1726882398.08086: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.08101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.08119: variable 'omit' from source: magic vars 12081 1726882398.08379: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.08397: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.08407: variable 'item' from source: unknown 12081 1726882398.08487: variable 'item' from source: unknown 12081 1726882398.08528: variable 'item' from source: unknown 12081 1726882398.08606: variable 'item' from source: unknown 12081 1726882398.08835: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.08850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.08870: variable 'omit' from source: magic vars 12081 1726882398.09049: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.09066: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.09076: variable 'item' from source: unknown 12081 1726882398.09147: variable 'item' from source: unknown 12081 1726882398.09188: variable 'item' from source: unknown 12081 1726882398.09261: variable 'item' from source: unknown 12081 1726882398.09390: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882398.09403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.09416: variable 'omit' from source: magic vars 12081 1726882398.09563: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.09575: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.09583: variable 'item' from source: unknown 12081 1726882398.09638: variable 'item' from source: unknown 12081 1726882398.09678: variable 'item' from source: unknown 12081 1726882398.09735: variable 'item' from source: unknown 12081 1726882398.09811: dumping result to json 12081 1726882398.09821: done dumping result, returning 12081 1726882398.09831: done running TaskExecutor() for managed_node3/TASK: Asserts [0e448fcc-3ce9-0a3f-ff3c-00000000008d] 12081 1726882398.09843: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008d 12081 1726882398.09933: no more pending results, returning what we have 12081 1726882398.09939: in VariableManager get_vars() 12081 1726882398.09981: Calling all_inventory to load vars for managed_node3 12081 1726882398.09984: Calling groups_inventory to load vars for managed_node3 12081 1726882398.09988: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.10005: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.10009: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.10012: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.11083: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008d 12081 1726882398.11087: WORKER PROCESS EXITING 12081 1726882398.11784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.13644: done with get_vars() 12081 1726882398.13673: variable 'ansible_search_path' from source: unknown 12081 1726882398.13674: variable 'ansible_search_path' from source: unknown 12081 
1726882398.13719: variable 'ansible_search_path' from source: unknown 12081 1726882398.13720: variable 'ansible_search_path' from source: unknown 12081 1726882398.13754: variable 'ansible_search_path' from source: unknown 12081 1726882398.13756: variable 'ansible_search_path' from source: unknown 12081 1726882398.13785: we have included files to process 12081 1726882398.13786: generating all_blocks data 12081 1726882398.13788: done generating all_blocks data 12081 1726882398.13801: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12081 1726882398.13803: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12081 1726882398.13805: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12081 1726882398.13968: in VariableManager get_vars() 12081 1726882398.13989: done with get_vars() 12081 1726882398.13996: variable 'item' from source: include params 12081 1726882398.14104: variable 'item' from source: include params 12081 1726882398.14136: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12081 1726882398.14223: in VariableManager get_vars() 12081 1726882398.14244: done with get_vars() 12081 1726882398.14375: done processing included file 12081 1726882398.14377: iterating over new_blocks loaded from include file 12081 1726882398.14379: in VariableManager get_vars() 12081 1726882398.14392: done with get_vars() 12081 1726882398.14394: filtering new block on tags 12081 1726882398.14442: done filtering new block on tags 12081 
1726882398.14446: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed_node3 => (item=tasks/assert_controller_device_present.yml) 12081 1726882398.14453: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12081 1726882398.14455: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12081 1726882398.14457: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12081 1726882398.14587: in VariableManager get_vars() 12081 1726882398.14604: done with get_vars() 12081 1726882398.14616: done processing included file 12081 1726882398.14618: iterating over new_blocks loaded from include file 12081 1726882398.14619: in VariableManager get_vars() 12081 1726882398.14632: done with get_vars() 12081 1726882398.14633: filtering new block on tags 12081 1726882398.14657: done filtering new block on tags 12081 1726882398.14659: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed_node3 => (item=tasks/assert_bond_port_profile_present.yml) 12081 1726882398.14665: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882398.14666: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882398.14674: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882398.14988: in VariableManager get_vars() 12081 1726882398.15006: done with get_vars() 12081 1726882398.15044: in VariableManager get_vars() 12081 1726882398.15066: done with get_vars() 12081 1726882398.15079: done processing included file 12081 1726882398.15080: iterating over new_blocks loaded from include file 12081 1726882398.15082: in VariableManager get_vars() 12081 1726882398.15094: done with get_vars() 12081 1726882398.15096: filtering new block on tags 12081 1726882398.15133: done filtering new block on tags 12081 1726882398.15135: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 12081 1726882398.15139: extending task lists for all hosts with included blocks 12081 1726882398.16643: done extending task lists 12081 1726882398.16645: done processing included files 12081 1726882398.16646: results queue empty 12081 1726882398.16646: checking for any_errors_fatal 12081 1726882398.16655: done checking for any_errors_fatal 12081 1726882398.16656: checking for max_fail_percentage 12081 1726882398.16658: done checking for max_fail_percentage 12081 1726882398.16659: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.16661: done checking to see if all hosts have failed 12081 1726882398.16662: getting the remaining hosts for this loop 12081 1726882398.16664: done getting the remaining hosts for this loop 12081 1726882398.16667: getting the next task for host managed_node3 12081 1726882398.16672: done getting next task for host managed_node3 12081 1726882398.16674: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12081 1726882398.16677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882398.16680: getting variables 12081 1726882398.16680: in VariableManager get_vars() 12081 1726882398.16689: Calling all_inventory to load vars for managed_node3 12081 1726882398.16691: Calling groups_inventory to load vars for managed_node3 12081 1726882398.16694: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.16700: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.16702: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.16705: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.17986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.19701: done with get_vars() 12081 1726882398.19729: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:18 -0400 (0:00:00.130) 0:00:18.001 ****** 12081 1726882398.19812: entering _queue_task() for managed_node3/include_tasks 12081 1726882398.20147: worker is 1 (out of 1 available) 12081 1726882398.20160: exiting _queue_task() for managed_node3/include_tasks 12081 1726882398.20174: done queuing things up, now waiting for results queue to drain 12081 1726882398.20175: waiting for pending results... 
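Annotation: the task queued here includes `get_interface_stat.yml` with a templated name, `Get stat for interface {{ interface }}`. Its body is not shown in the trace; in the linux_system_roles test suites this helper conventionally stats the device node under `/sys/class/net`, so a plausible sketch (an assumption, not the included file's verbatim content) is:

```yaml
# Assumed content of get_interface_stat.yml: check whether the kernel
# exposes the named interface, registering the result for later asserts.
# The variable names "interface" and "interface_stat" are conventions in
# these test playbooks, inferred from the templated task name above.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

A follow-up assert task would then test `interface_stat.stat.exists` to confirm the controller device (`nm-bond`) is present.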
12081 1726882398.20466: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12081 1726882398.20605: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000003f5 12081 1726882398.20626: variable 'ansible_search_path' from source: unknown 12081 1726882398.20634: variable 'ansible_search_path' from source: unknown 12081 1726882398.20679: calling self._execute() 12081 1726882398.20773: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.20784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.20797: variable 'omit' from source: magic vars 12081 1726882398.21178: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.21199: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.21209: _execute() done 12081 1726882398.21216: dumping result to json 12081 1726882398.21222: done dumping result, returning 12081 1726882398.21233: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000003f5] 12081 1726882398.21245: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003f5 12081 1726882398.21365: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003f5 12081 1726882398.21372: WORKER PROCESS EXITING 12081 1726882398.21401: no more pending results, returning what we have 12081 1726882398.21407: in VariableManager get_vars() 12081 1726882398.21444: Calling all_inventory to load vars for managed_node3 12081 1726882398.21446: Calling groups_inventory to load vars for managed_node3 12081 1726882398.21453: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.21470: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.21473: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.21476: Calling groups_plugins_play to load vars for managed_node3 12081 
1726882398.26922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.27865: done with get_vars() 12081 1726882398.27881: variable 'ansible_search_path' from source: unknown 12081 1726882398.27882: variable 'ansible_search_path' from source: unknown 12081 1726882398.27907: we have included files to process 12081 1726882398.27907: generating all_blocks data 12081 1726882398.27908: done generating all_blocks data 12081 1726882398.27909: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882398.27910: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882398.27911: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882398.28030: done processing included file 12081 1726882398.28031: iterating over new_blocks loaded from include file 12081 1726882398.28032: in VariableManager get_vars() 12081 1726882398.28043: done with get_vars() 12081 1726882398.28044: filtering new block on tags 12081 1726882398.28066: done filtering new block on tags 12081 1726882398.28067: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12081 1726882398.28071: extending task lists for all hosts with included blocks 12081 1726882398.28197: done extending task lists 12081 1726882398.28198: done processing included files 12081 1726882398.28198: results queue empty 12081 1726882398.28199: checking for any_errors_fatal 12081 1726882398.28201: done checking for any_errors_fatal 12081 1726882398.28202: checking for max_fail_percentage 12081 1726882398.28202: done checking for 
max_fail_percentage 12081 1726882398.28203: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.28203: done checking to see if all hosts have failed 12081 1726882398.28204: getting the remaining hosts for this loop 12081 1726882398.28205: done getting the remaining hosts for this loop 12081 1726882398.28206: getting the next task for host managed_node3 12081 1726882398.28209: done getting next task for host managed_node3 12081 1726882398.28210: ^ task is: TASK: Get stat for interface {{ interface }} 12081 1726882398.28212: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882398.28214: getting variables 12081 1726882398.28215: in VariableManager get_vars() 12081 1726882398.28221: Calling all_inventory to load vars for managed_node3 12081 1726882398.28222: Calling groups_inventory to load vars for managed_node3 12081 1726882398.28224: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.28227: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.28229: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.28230: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.29046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.30353: done with get_vars() 12081 1726882398.30369: done getting variables 12081 1726882398.30471: variable 'interface' from source: task vars 12081 1726882398.30473: variable 'controller_device' from source: play vars 12081 1726882398.30516: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:18 -0400 (0:00:00.107) 0:00:18.108 ****** 12081 1726882398.30537: entering _queue_task() for managed_node3/stat 12081 1726882398.30783: worker is 1 (out of 1 available) 12081 1726882398.30797: exiting _queue_task() for managed_node3/stat 12081 1726882398.30814: done queuing things up, now waiting for results queue to drain 12081 1726882398.30816: waiting for pending results... 
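At this point the trace queues the templated task "Get stat for interface nm-bond" (`{{ interface }}` resolved from the `controller_device` play var) and, as the module result further down shows, the stat runs against `/sys/class/net/nm-bond`. A minimal sketch of the same presence check outside Ansible, assuming a Linux sysfs layout; the `sysfs_net_path` and `interface_present` helpers are hypothetical illustrations, not Ansible APIs:

```python
import os

def sysfs_net_path(interface: str) -> str:
    # Hypothetical helper: build the sysfs path that the stat task in the
    # log inspects for a given interface name.
    return os.path.join("/sys/class/net", interface)

def interface_present(interface: str) -> bool:
    # /sys/class/net/<iface> entries are symlinks for present devices (the
    # log's stat result shows islnk=true), so lexists() is used to test the
    # link itself rather than follow it.
    return os.path.lexists(sysfs_net_path(interface))
```

On a host where the bond is up, `interface_present("nm-bond")` would report True; on the machine running this sketch the answer simply reflects whatever sysfs contains.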
12081 1726882398.30999: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 12081 1726882398.31094: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004af 12081 1726882398.31105: variable 'ansible_search_path' from source: unknown 12081 1726882398.31108: variable 'ansible_search_path' from source: unknown 12081 1726882398.31140: calling self._execute() 12081 1726882398.31214: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.31218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.31227: variable 'omit' from source: magic vars 12081 1726882398.31497: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.31508: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.31514: variable 'omit' from source: magic vars 12081 1726882398.31560: variable 'omit' from source: magic vars 12081 1726882398.31629: variable 'interface' from source: task vars 12081 1726882398.31633: variable 'controller_device' from source: play vars 12081 1726882398.31681: variable 'controller_device' from source: play vars 12081 1726882398.31701: variable 'omit' from source: magic vars 12081 1726882398.31737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882398.31761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882398.31779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882398.31797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.31807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.31831: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882398.31834: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.31836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.31923: Set connection var ansible_pipelining to False 12081 1726882398.31926: Set connection var ansible_shell_type to sh 12081 1726882398.31932: Set connection var ansible_shell_executable to /bin/sh 12081 1726882398.31934: Set connection var ansible_connection to ssh 12081 1726882398.31939: Set connection var ansible_timeout to 10 12081 1726882398.31944: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882398.31966: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.31970: variable 'ansible_connection' from source: unknown 12081 1726882398.31972: variable 'ansible_module_compression' from source: unknown 12081 1726882398.31975: variable 'ansible_shell_type' from source: unknown 12081 1726882398.31978: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.31981: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.31983: variable 'ansible_pipelining' from source: unknown 12081 1726882398.31986: variable 'ansible_timeout' from source: unknown 12081 1726882398.31988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.32142: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882398.32150: variable 'omit' from source: magic vars 12081 1726882398.32158: starting attempt loop 12081 1726882398.32161: running the handler 12081 1726882398.32175: _low_level_execute_command(): starting 12081 1726882398.32181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882398.32712: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882398.32721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.32750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.32766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882398.32778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.32822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.32835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.32950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.34627: stdout chunk (state=3): >>>/root <<< 12081 1726882398.34733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882398.34789: stderr chunk (state=3): >>><<< 12081 1726882398.34792: stdout chunk (state=3): >>><<< 12081 1726882398.34818: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882398.34829: _low_level_execute_command(): starting 12081 1726882398.34836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356 `" && echo ansible-tmp-1726882398.3481631-12882-139019393817356="` echo /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356 `" ) && sleep 0' 12081 1726882398.35301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.35307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.35338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.35360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.35411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.35422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882398.35426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.35539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.37420: stdout chunk (state=3): >>>ansible-tmp-1726882398.3481631-12882-139019393817356=/root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356 <<< 12081 1726882398.37532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882398.37579: stderr chunk (state=3): >>><<< 12081 1726882398.37582: stdout chunk (state=3): >>><<< 12081 1726882398.37600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882398.3481631-12882-139019393817356=/root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882398.37638: variable 'ansible_module_compression' from source: unknown 12081 1726882398.37690: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882398.37725: variable 'ansible_facts' from source: unknown 12081 1726882398.37777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/AnsiballZ_stat.py 12081 1726882398.37884: Sending initial data 12081 1726882398.37893: Sent initial data (153 bytes) 12081 1726882398.38558: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.38562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.38615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882398.38618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.38621: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882398.38624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882398.38626: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.38676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.38680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.38786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.40508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 12081 1726882398.40512: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882398.40611: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882398.40709: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpai7f19_z 
/root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/AnsiballZ_stat.py <<< 12081 1726882398.40802: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882398.41815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882398.41920: stderr chunk (state=3): >>><<< 12081 1726882398.41923: stdout chunk (state=3): >>><<< 12081 1726882398.41944: done transferring module to remote 12081 1726882398.41953: _low_level_execute_command(): starting 12081 1726882398.41960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/ /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/AnsiballZ_stat.py && sleep 0' 12081 1726882398.42414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.42427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.42447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882398.42459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882398.42472: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882398.42520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.42533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.42638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.44388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882398.44437: stderr chunk (state=3): >>><<< 12081 1726882398.44440: stdout chunk (state=3): >>><<< 12081 1726882398.44456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882398.44459: _low_level_execute_command(): starting 12081 1726882398.44462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/AnsiballZ_stat.py && sleep 0' 12081 1726882398.44900: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.44905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.44937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.44949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.45291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.45322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882398.45340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.45484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.58654: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27515, "dev": 21, "nlink": 1, "atime": 1726882397.082162, "mtime": 
1726882397.082162, "ctime": 1726882397.082162, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882398.59683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882398.59687: stdout chunk (state=3): >>><<< 12081 1726882398.59689: stderr chunk (state=3): >>><<< 12081 1726882398.59856: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27515, "dev": 21, "nlink": 1, "atime": 1726882397.082162, "mtime": 1726882397.082162, "ctime": 1726882397.082162, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
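The module's stdout captured above is a single JSON document that the controller parses back into a task result. A short sketch of extracting the fields the follow-up presence assertion cares about; the payload literal is abridged from the log output, and `summarize_stat` is a hypothetical helper, not part of Ansible:

```python
import json

# Abridged from the stat module stdout shown in the log above.
payload = json.loads(
    '{"changed": false, "stat": {"exists": true, '
    '"path": "/sys/class/net/nm-bond", "islnk": true, '
    '"lnk_source": "/sys/devices/virtual/net/nm-bond"}}'
)

def summarize_stat(result: dict) -> dict:
    # Hypothetical helper: keep only what a device-presence assertion needs --
    # whether the path exists, whether it is a symlink, and its resolved source.
    st = result["stat"]
    return {
        "exists": st["exists"],
        "islnk": st["islnk"],
        "target": st["lnk_source"],
    }
```

For this run the summary confirms the bond device is present as a symlink into `/sys/devices/virtual/net/`, which is what the "Assert that the interface is present" task queued next will check.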
12081 1726882398.59861: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882398.59866: _low_level_execute_command(): starting 12081 1726882398.59868: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882398.3481631-12882-139019393817356/ > /dev/null 2>&1 && sleep 0' 12081 1726882398.60491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882398.60509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.60523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882398.60553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.60603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882398.60618: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882398.60631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.60647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882398.60662: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882398.60676: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882398.60688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882398.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882398.60717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882398.60732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882398.60742: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882398.60757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882398.60837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882398.60858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882398.60876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882398.61016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882398.62931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882398.62935: stdout chunk (state=3): >>><<< 12081 1726882398.62941: stderr chunk (state=3): >>><<< 12081 1726882398.62968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882398.62974: handler run complete 12081 1726882398.63024: attempt loop complete, returning result 12081 1726882398.63027: _execute() done 12081 1726882398.63030: dumping result to json 12081 1726882398.63034: done dumping result, returning 12081 1726882398.63043: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0e448fcc-3ce9-0a3f-ff3c-0000000004af] 12081 1726882398.63053: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004af 12081 1726882398.63177: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004af 12081 1726882398.63180: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882397.082162, "block_size": 4096, "blocks": 0, "ctime": 1726882397.082162, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27515, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882397.082162, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, 
"size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12081 1726882398.63320: no more pending results, returning what we have 12081 1726882398.63324: results queue empty 12081 1726882398.63325: checking for any_errors_fatal 12081 1726882398.63327: done checking for any_errors_fatal 12081 1726882398.63328: checking for max_fail_percentage 12081 1726882398.63330: done checking for max_fail_percentage 12081 1726882398.63331: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.63332: done checking to see if all hosts have failed 12081 1726882398.63333: getting the remaining hosts for this loop 12081 1726882398.63335: done getting the remaining hosts for this loop 12081 1726882398.63339: getting the next task for host managed_node3 12081 1726882398.63350: done getting next task for host managed_node3 12081 1726882398.63353: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12081 1726882398.63357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882398.63365: getting variables 12081 1726882398.63367: in VariableManager get_vars() 12081 1726882398.63398: Calling all_inventory to load vars for managed_node3 12081 1726882398.63400: Calling groups_inventory to load vars for managed_node3 12081 1726882398.63405: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.63417: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.63419: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.63422: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.65191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.66999: done with get_vars() 12081 1726882398.67027: done getting variables 12081 1726882398.67100: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882398.67235: variable 'interface' from source: task vars 12081 1726882398.67239: variable 'controller_device' from source: play vars 12081 1726882398.67308: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:18 -0400 (0:00:00.367) 0:00:18.476 ****** 12081 1726882398.67340: entering _queue_task() for managed_node3/assert 12081 1726882398.67676: worker is 1 (out of 1 available) 12081 
1726882398.67693: exiting _queue_task() for managed_node3/assert 12081 1726882398.67707: done queuing things up, now waiting for results queue to drain 12081 1726882398.67708: waiting for pending results... 12081 1726882398.68069: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 12081 1726882398.68225: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000003f6 12081 1726882398.68244: variable 'ansible_search_path' from source: unknown 12081 1726882398.68247: variable 'ansible_search_path' from source: unknown 12081 1726882398.68290: calling self._execute() 12081 1726882398.68395: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.68402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.68416: variable 'omit' from source: magic vars 12081 1726882398.68816: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.68830: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.68836: variable 'omit' from source: magic vars 12081 1726882398.68909: variable 'omit' from source: magic vars 12081 1726882398.69010: variable 'interface' from source: task vars 12081 1726882398.69014: variable 'controller_device' from source: play vars 12081 1726882398.69083: variable 'controller_device' from source: play vars 12081 1726882398.69102: variable 'omit' from source: magic vars 12081 1726882398.69214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882398.69217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882398.69220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882398.69668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 
1726882398.69671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.69674: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882398.69676: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.69679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.69681: Set connection var ansible_pipelining to False 12081 1726882398.69683: Set connection var ansible_shell_type to sh 12081 1726882398.69685: Set connection var ansible_shell_executable to /bin/sh 12081 1726882398.69687: Set connection var ansible_connection to ssh 12081 1726882398.69689: Set connection var ansible_timeout to 10 12081 1726882398.69691: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882398.69693: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.69695: variable 'ansible_connection' from source: unknown 12081 1726882398.69697: variable 'ansible_module_compression' from source: unknown 12081 1726882398.69699: variable 'ansible_shell_type' from source: unknown 12081 1726882398.69701: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.69703: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.69704: variable 'ansible_pipelining' from source: unknown 12081 1726882398.69707: variable 'ansible_timeout' from source: unknown 12081 1726882398.69709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.69715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882398.69717: variable 'omit' from source: 
magic vars 12081 1726882398.69720: starting attempt loop 12081 1726882398.69722: running the handler 12081 1726882398.69794: variable 'interface_stat' from source: set_fact 12081 1726882398.69821: Evaluated conditional (interface_stat.stat.exists): True 12081 1726882398.69834: handler run complete 12081 1726882398.69853: attempt loop complete, returning result 12081 1726882398.69864: _execute() done 12081 1726882398.69877: dumping result to json 12081 1726882398.69885: done dumping result, returning 12081 1726882398.69896: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0e448fcc-3ce9-0a3f-ff3c-0000000003f6] 12081 1726882398.69906: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003f6 12081 1726882398.70016: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003f6 12081 1726882398.70023: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882398.70094: no more pending results, returning what we have 12081 1726882398.70098: results queue empty 12081 1726882398.70099: checking for any_errors_fatal 12081 1726882398.70110: done checking for any_errors_fatal 12081 1726882398.70111: checking for max_fail_percentage 12081 1726882398.70114: done checking for max_fail_percentage 12081 1726882398.70115: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.70117: done checking to see if all hosts have failed 12081 1726882398.70117: getting the remaining hosts for this loop 12081 1726882398.70119: done getting the remaining hosts for this loop 12081 1726882398.70124: getting the next task for host managed_node3 12081 1726882398.70136: done getting next task for host managed_node3 12081 1726882398.70140: ^ task is: TASK: Include the task 'assert_profile_present.yml' 12081 1726882398.70144: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882398.70150: getting variables 12081 1726882398.70152: in VariableManager get_vars() 12081 1726882398.70190: Calling all_inventory to load vars for managed_node3 12081 1726882398.70193: Calling groups_inventory to load vars for managed_node3 12081 1726882398.70197: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.70211: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.70214: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.70216: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.72086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.73808: done with get_vars() 12081 1726882398.73835: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Friday 20 September 2024 21:33:18 -0400 (0:00:00.065) 0:00:18.542 ****** 12081 1726882398.73942: entering _queue_task() for managed_node3/include_tasks 12081 1726882398.74289: worker is 1 (out of 1 
available) 12081 1726882398.74305: exiting _queue_task() for managed_node3/include_tasks 12081 1726882398.74318: done queuing things up, now waiting for results queue to drain 12081 1726882398.74319: waiting for pending results... 12081 1726882398.74632: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 12081 1726882398.74754: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000003fb 12081 1726882398.74783: variable 'ansible_search_path' from source: unknown 12081 1726882398.74791: variable 'ansible_search_path' from source: unknown 12081 1726882398.74850: variable 'controller_profile' from source: play vars 12081 1726882398.75069: variable 'controller_profile' from source: play vars 12081 1726882398.75094: variable 'port1_profile' from source: play vars 12081 1726882398.75173: variable 'port1_profile' from source: play vars 12081 1726882398.75185: variable 'port2_profile' from source: play vars 12081 1726882398.75257: variable 'port2_profile' from source: play vars 12081 1726882398.75280: variable 'omit' from source: magic vars 12081 1726882398.75438: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.75452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.75471: variable 'omit' from source: magic vars 12081 1726882398.75741: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.75759: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.75805: variable 'bond_port_profile' from source: unknown 12081 1726882398.75876: variable 'bond_port_profile' from source: unknown 12081 1726882398.76109: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.76123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.76138: variable 'omit' from source: magic vars 12081 1726882398.76313: variable 'ansible_distribution_major_version' 
from source: facts 12081 1726882398.76323: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.76353: variable 'bond_port_profile' from source: unknown 12081 1726882398.76423: variable 'bond_port_profile' from source: unknown 12081 1726882398.76552: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.76565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.76577: variable 'omit' from source: magic vars 12081 1726882398.76735: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.76745: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.76781: variable 'bond_port_profile' from source: unknown 12081 1726882398.76852: variable 'bond_port_profile' from source: unknown 12081 1726882398.76931: dumping result to json 12081 1726882398.76946: done dumping result, returning 12081 1726882398.76956: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000003fb] 12081 1726882398.76968: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003fb 12081 1726882398.77034: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000003fb 12081 1726882398.77043: WORKER PROCESS EXITING 12081 1726882398.77084: no more pending results, returning what we have 12081 1726882398.77090: in VariableManager get_vars() 12081 1726882398.77125: Calling all_inventory to load vars for managed_node3 12081 1726882398.77128: Calling groups_inventory to load vars for managed_node3 12081 1726882398.77131: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.77145: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.77147: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.77150: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.78919: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.80665: done with get_vars() 12081 1726882398.80691: variable 'ansible_search_path' from source: unknown 12081 1726882398.80692: variable 'ansible_search_path' from source: unknown 12081 1726882398.80702: variable 'item' from source: include params 12081 1726882398.80818: variable 'item' from source: include params 12081 1726882398.80866: variable 'ansible_search_path' from source: unknown 12081 1726882398.80868: variable 'ansible_search_path' from source: unknown 12081 1726882398.80875: variable 'item' from source: include params 12081 1726882398.80937: variable 'item' from source: include params 12081 1726882398.80975: variable 'ansible_search_path' from source: unknown 12081 1726882398.80976: variable 'ansible_search_path' from source: unknown 12081 1726882398.80982: variable 'item' from source: include params 12081 1726882398.81037: variable 'item' from source: include params 12081 1726882398.81073: we have included files to process 12081 1726882398.81074: generating all_blocks data 12081 1726882398.81076: done generating all_blocks data 12081 1726882398.81080: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81081: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81083: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81284: in VariableManager get_vars() 12081 1726882398.81305: done with get_vars() 12081 1726882398.81567: done processing included file 12081 1726882398.81569: iterating over new_blocks loaded from include file 12081 1726882398.81570: in VariableManager get_vars() 12081 1726882398.81585: 
done with get_vars() 12081 1726882398.81586: filtering new block on tags 12081 1726882398.81649: done filtering new block on tags 12081 1726882398.81652: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 12081 1726882398.81657: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81658: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81661: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.81769: in VariableManager get_vars() 12081 1726882398.81788: done with get_vars() 12081 1726882398.82022: done processing included file 12081 1726882398.82024: iterating over new_blocks loaded from include file 12081 1726882398.82029: in VariableManager get_vars() 12081 1726882398.82215: done with get_vars() 12081 1726882398.82218: filtering new block on tags 12081 1726882398.82277: done filtering new block on tags 12081 1726882398.82279: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 12081 1726882398.82283: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.82285: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.82287: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12081 1726882398.82386: in VariableManager get_vars() 12081 1726882398.82401: done with get_vars() 12081 1726882398.82616: done processing included file 12081 1726882398.82618: iterating over new_blocks loaded from include file 12081 1726882398.82619: in VariableManager get_vars() 12081 1726882398.82633: done with get_vars() 12081 1726882398.82635: filtering new block on tags 12081 1726882398.82700: done filtering new block on tags 12081 1726882398.82703: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 12081 1726882398.82707: extending task lists for all hosts with included blocks 12081 1726882398.82825: done extending task lists 12081 1726882398.82827: done processing included files 12081 1726882398.82828: results queue empty 12081 1726882398.82829: checking for any_errors_fatal 12081 1726882398.82833: done checking for any_errors_fatal 12081 1726882398.82834: checking for max_fail_percentage 12081 1726882398.82835: done checking for max_fail_percentage 12081 1726882398.82836: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.82837: done checking to see if all hosts have failed 12081 1726882398.82838: getting the remaining hosts for this loop 12081 1726882398.82839: done getting the remaining hosts for this loop 12081 1726882398.82842: getting the next task for host managed_node3 12081 1726882398.82846: done getting next task for host managed_node3 12081 1726882398.82848: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12081 1726882398.82852: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882398.82854: getting variables 12081 1726882398.82855: in VariableManager get_vars() 12081 1726882398.82867: Calling all_inventory to load vars for managed_node3 12081 1726882398.82869: Calling groups_inventory to load vars for managed_node3 12081 1726882398.82872: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.82878: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.82880: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.82883: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.84184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.85923: done with get_vars() 12081 1726882398.85953: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 21:33:18 -0400 (0:00:00.120) 0:00:18.663 ****** 12081 1726882398.86028: entering _queue_task() for managed_node3/include_tasks 12081 1726882398.86389: worker is 1 (out of 1 available) 12081 1726882398.86403: exiting _queue_task() for managed_node3/include_tasks 12081 1726882398.86416: done queuing things up, now waiting for results queue to drain 12081 1726882398.86417: waiting for pending results... 12081 1726882398.86726: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12081 1726882398.86849: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004d9 12081 1726882398.86877: variable 'ansible_search_path' from source: unknown 12081 1726882398.86885: variable 'ansible_search_path' from source: unknown 12081 1726882398.86932: calling self._execute() 12081 1726882398.87036: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.87048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.87064: variable 'omit' from source: magic vars 12081 1726882398.87455: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.87481: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.87492: _execute() done 12081 1726882398.87500: dumping result to json 12081 1726882398.87510: done dumping result, returning 12081 1726882398.87523: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000004d9] 12081 1726882398.87535: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004d9 12081 1726882398.87668: no more pending results, returning what we have 12081 1726882398.87676: in VariableManager get_vars() 12081 1726882398.87714: Calling all_inventory to load vars for managed_node3 12081 1726882398.87717: Calling groups_inventory to load vars for managed_node3 12081 1726882398.87722: Calling all_plugins_inventory to load vars for 
managed_node3 12081 1726882398.87737: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.87740: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.87744: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.88849: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004d9 12081 1726882398.88855: WORKER PROCESS EXITING 12081 1726882398.89636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.91360: done with get_vars() 12081 1726882398.91384: variable 'ansible_search_path' from source: unknown 12081 1726882398.91386: variable 'ansible_search_path' from source: unknown 12081 1726882398.91424: we have included files to process 12081 1726882398.91425: generating all_blocks data 12081 1726882398.91426: done generating all_blocks data 12081 1726882398.91428: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882398.91429: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882398.91431: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882398.92509: done processing included file 12081 1726882398.92511: iterating over new_blocks loaded from include file 12081 1726882398.92513: in VariableManager get_vars() 12081 1726882398.92529: done with get_vars() 12081 1726882398.92531: filtering new block on tags 12081 1726882398.92677: done filtering new block on tags 12081 1726882398.92681: in VariableManager get_vars() 12081 1726882398.92697: done with get_vars() 12081 1726882398.92699: filtering new block on tags 12081 1726882398.92763: done filtering new block on tags 12081 1726882398.92766: done iterating over new_blocks loaded 
from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12081 1726882398.92773: extending task lists for all hosts with included blocks 12081 1726882398.93147: done extending task lists 12081 1726882398.93149: done processing included files 12081 1726882398.93149: results queue empty 12081 1726882398.93150: checking for any_errors_fatal 12081 1726882398.93153: done checking for any_errors_fatal 12081 1726882398.93154: checking for max_fail_percentage 12081 1726882398.93155: done checking for max_fail_percentage 12081 1726882398.93156: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.93157: done checking to see if all hosts have failed 12081 1726882398.93158: getting the remaining hosts for this loop 12081 1726882398.93159: done getting the remaining hosts for this loop 12081 1726882398.93162: getting the next task for host managed_node3 12081 1726882398.93169: done getting next task for host managed_node3 12081 1726882398.93171: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882398.93175: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882398.93177: getting variables 12081 1726882398.93178: in VariableManager get_vars() 12081 1726882398.93187: Calling all_inventory to load vars for managed_node3 12081 1726882398.93189: Calling groups_inventory to load vars for managed_node3 12081 1726882398.93196: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.93202: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.93205: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.93208: Calling groups_plugins_play to load vars for managed_node3 12081 1726882398.94430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882398.96256: done with get_vars() 12081 1726882398.96280: done getting variables 12081 1726882398.96319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:18 -0400 (0:00:00.103) 0:00:18.766 ****** 12081 1726882398.96354: entering _queue_task() for managed_node3/set_fact 12081 1726882398.96687: worker is 1 (out of 1 available) 12081 1726882398.96699: exiting _queue_task() for managed_node3/set_fact 12081 1726882398.96711: done queuing things up, now waiting for results queue to drain 12081 1726882398.96712: waiting for pending results... 12081 1726882398.96995: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882398.97140: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004fc 12081 1726882398.97166: variable 'ansible_search_path' from source: unknown 12081 1726882398.97175: variable 'ansible_search_path' from source: unknown 12081 1726882398.97221: calling self._execute() 12081 1726882398.97322: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.97338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.97351: variable 'omit' from source: magic vars 12081 1726882398.97744: variable 'ansible_distribution_major_version' from source: facts 12081 1726882398.97768: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882398.97781: variable 'omit' from source: magic vars 12081 1726882398.97850: variable 'omit' from source: magic vars 12081 1726882398.97893: variable 'omit' from source: magic vars 12081 1726882398.97943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882398.97987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882398.98013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882398.98040: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.98057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882398.98098: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882398.98107: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.98114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.98229: Set connection var ansible_pipelining to False 12081 1726882398.98243: Set connection var ansible_shell_type to sh 12081 1726882398.98256: Set connection var ansible_shell_executable to /bin/sh 12081 1726882398.98266: Set connection var ansible_connection to ssh 12081 1726882398.98277: Set connection var ansible_timeout to 10 12081 1726882398.98286: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882398.98319: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.98327: variable 'ansible_connection' from source: unknown 12081 1726882398.98334: variable 'ansible_module_compression' from source: unknown 12081 1726882398.98340: variable 'ansible_shell_type' from source: unknown 12081 1726882398.98351: variable 'ansible_shell_executable' from source: unknown 12081 1726882398.98357: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882398.98367: variable 'ansible_pipelining' from source: unknown 12081 1726882398.98374: variable 'ansible_timeout' from source: unknown 12081 1726882398.98383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882398.98532: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882398.98548: variable 'omit' from source: magic vars 12081 1726882398.98557: starting attempt loop 12081 1726882398.98568: running the handler 12081 1726882398.98587: handler run complete 12081 1726882398.98600: attempt loop complete, returning result 12081 1726882398.98606: _execute() done 12081 1726882398.98613: dumping result to json 12081 1726882398.98620: done dumping result, returning 12081 1726882398.98634: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-0a3f-ff3c-0000000004fc] 12081 1726882398.98645: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fc 12081 1726882398.98751: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fc 12081 1726882398.98758: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12081 1726882398.98832: no more pending results, returning what we have 12081 1726882398.98837: results queue empty 12081 1726882398.98837: checking for any_errors_fatal 12081 1726882398.98839: done checking for any_errors_fatal 12081 1726882398.98840: checking for max_fail_percentage 12081 1726882398.98842: done checking for max_fail_percentage 12081 1726882398.98843: checking to see if all hosts have failed and the running result is not ok 12081 1726882398.98844: done checking to see if all hosts have failed 12081 1726882398.98845: getting the remaining hosts for this loop 12081 1726882398.98847: done getting the remaining hosts for this loop 12081 1726882398.98853: getting the next task for host managed_node3 12081 1726882398.98863: done getting next task for host managed_node3 12081 1726882398.98870: ^ task is: 
TASK: Stat profile file 12081 1726882398.98878: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882398.98883: getting variables 12081 1726882398.98886: in VariableManager get_vars() 12081 1726882398.98916: Calling all_inventory to load vars for managed_node3 12081 1726882398.98919: Calling groups_inventory to load vars for managed_node3 12081 1726882398.98922: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882398.98933: Calling all_plugins_play to load vars for managed_node3 12081 1726882398.98936: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882398.98938: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.00658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.02419: done with get_vars() 12081 1726882399.02452: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:19 -0400 (0:00:00.062) 0:00:18.828 ****** 12081 1726882399.02575: entering _queue_task() for managed_node3/stat 12081 1726882399.02935: worker is 1 (out of 1 available) 12081 1726882399.02956: exiting _queue_task() for managed_node3/stat 12081 1726882399.02976: done queuing things up, now waiting for results queue to drain 12081 1726882399.02977: waiting for pending results... 
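The "Stat profile file" task queued here later reports `{"changed": false, "stat": {"exists": false}}` for `/etc/sysconfig/network-scripts/ifcfg-bond0` (its full module result and `module_args` appear further down in this log). A minimal sketch of that check, assuming only the `changed`/`exists` fields matter for this test; `stat_exists` is a hypothetical helper name, not Ansible's actual module code:

```python
import os

def stat_exists(path):
    # Sketch of the 'exists' portion of an Ansible stat result:
    # stat never modifies the target (changed is always False), and a
    # missing path yields {"exists": False} rather than an error.
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# The profile file probed in this run (result depends on the host):
result = stat_exists("/etc/sysconfig/network-scripts/ifcfg-bond0")
```

The real module additionally gathers mode, checksum, and MIME data when requested; the log shows those options (`get_attributes`, `get_checksum`, `get_mime`) explicitly disabled for this probe.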
12081 1726882399.03300: running TaskExecutor() for managed_node3/TASK: Stat profile file 12081 1726882399.03443: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004fd 12081 1726882399.03467: variable 'ansible_search_path' from source: unknown 12081 1726882399.03477: variable 'ansible_search_path' from source: unknown 12081 1726882399.03530: calling self._execute() 12081 1726882399.03635: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.03647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.03666: variable 'omit' from source: magic vars 12081 1726882399.04072: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.04092: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.04104: variable 'omit' from source: magic vars 12081 1726882399.04175: variable 'omit' from source: magic vars 12081 1726882399.04277: variable 'profile' from source: include params 12081 1726882399.04289: variable 'bond_port_profile' from source: include params 12081 1726882399.04356: variable 'bond_port_profile' from source: include params 12081 1726882399.04393: variable 'omit' from source: magic vars 12081 1726882399.04441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882399.04484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882399.04513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882399.04537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.04555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.04598: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882399.04612: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.04620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.04736: Set connection var ansible_pipelining to False 12081 1726882399.04744: Set connection var ansible_shell_type to sh 12081 1726882399.04758: Set connection var ansible_shell_executable to /bin/sh 12081 1726882399.04769: Set connection var ansible_connection to ssh 12081 1726882399.04780: Set connection var ansible_timeout to 10 12081 1726882399.04790: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882399.04826: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.04835: variable 'ansible_connection' from source: unknown 12081 1726882399.04843: variable 'ansible_module_compression' from source: unknown 12081 1726882399.04850: variable 'ansible_shell_type' from source: unknown 12081 1726882399.04856: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.04863: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.04873: variable 'ansible_pipelining' from source: unknown 12081 1726882399.04880: variable 'ansible_timeout' from source: unknown 12081 1726882399.04887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.05110: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882399.05124: variable 'omit' from source: magic vars 12081 1726882399.05137: starting attempt loop 12081 1726882399.05146: running the handler 12081 1726882399.05165: _low_level_execute_command(): starting 12081 1726882399.05176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882399.05935: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882399.05951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.05970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.05991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.06043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.06059: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.06078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.06095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882399.06109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.06124: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.06137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.06151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.06174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.06187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.06199: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.06214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.06297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.06322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.06345: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.06492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.08192: stdout chunk (state=3): >>>/root <<< 12081 1726882399.08371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.08374: stdout chunk (state=3): >>><<< 12081 1726882399.08393: stderr chunk (state=3): >>><<< 12081 1726882399.08507: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.08512: _low_level_execute_command(): starting 12081 1726882399.08515: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502 `" && echo 
ansible-tmp-1726882399.0841212-12909-78977735360502="` echo /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502 `" ) && sleep 0' 12081 1726882399.09085: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.09089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.09126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882399.09130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.09133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.09208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.09221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.09351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.11232: stdout chunk (state=3): >>>ansible-tmp-1726882399.0841212-12909-78977735360502=/root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502 <<< 12081 1726882399.11391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.11450: stderr chunk 
(state=3): >>><<< 12081 1726882399.11453: stdout chunk (state=3): >>><<< 12081 1726882399.11672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882399.0841212-12909-78977735360502=/root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.11675: variable 'ansible_module_compression' from source: unknown 12081 1726882399.11678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882399.11680: variable 'ansible_facts' from source: unknown 12081 1726882399.11712: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/AnsiballZ_stat.py 12081 1726882399.11869: Sending initial data 12081 1726882399.11880: Sent initial data (152 bytes) 12081 1726882399.12897: stderr chunk 
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882399.12912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.12927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.12947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.12997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.13010: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.13026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.13045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882399.13057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.13073: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.13090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.13105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.13122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.13136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.13148: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.13162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.13245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.13262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.13280: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882399.13414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.15154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882399.15255: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882399.15355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpubq9kfjw /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/AnsiballZ_stat.py <<< 12081 1726882399.15448: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882399.16769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.17032: stderr chunk (state=3): >>><<< 12081 1726882399.17035: stdout chunk (state=3): >>><<< 12081 1726882399.17038: done transferring module to remote 12081 1726882399.17040: _low_level_execute_command(): starting 12081 1726882399.17043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/ /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/AnsiballZ_stat.py && sleep 0' 12081 1726882399.17647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882399.17662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.17681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.17704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.17753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.17769: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.17786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.17804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882399.17822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.17834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.17847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.17865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.17882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.17896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.17908: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.17927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.18004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.18021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.18041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.18188: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.19920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.19991: stderr chunk (state=3): >>><<< 12081 1726882399.19994: stdout chunk (state=3): >>><<< 12081 1726882399.20013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.20016: _low_level_execute_command(): starting 12081 1726882399.20021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/AnsiballZ_stat.py && sleep 0' 12081 1726882399.20658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882399.20670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.20680: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.20694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.20732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.20739: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.20748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.20766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882399.20774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.20781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.20788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.20800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.20808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.20816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.20822: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.20831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.20916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.20923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.20926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.21062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.33973: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882399.34930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882399.35011: stderr chunk (state=3): >>><<< 12081 1726882399.35014: stdout chunk (state=3): >>><<< 12081 1726882399.35146: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.9.105 closed. 12081 1726882399.35150: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882399.35161: _low_level_execute_command(): starting 12081 1726882399.35165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882399.0841212-12909-78977735360502/ > /dev/null 2>&1 && sleep 0' 12081 1726882399.35744: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882399.35761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.35778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.35796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.35840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.35853: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.35875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.35892: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 12081 1726882399.35904: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.35914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.35925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.35937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.35954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.35970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.35982: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.35994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.36074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.36096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.36111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.36243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.38089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.38198: stderr chunk (state=3): >>><<< 12081 1726882399.38202: stdout chunk (state=3): >>><<< 12081 1726882399.38270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.38273: handler run complete 12081 1726882399.38276: attempt loop complete, returning result 12081 1726882399.38278: _execute() done 12081 1726882399.38280: dumping result to json 12081 1726882399.38282: done dumping result, returning 12081 1726882399.38473: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-0a3f-ff3c-0000000004fd] 12081 1726882399.38476: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fd 12081 1726882399.38555: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fd 12081 1726882399.38558: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12081 1726882399.38640: no more pending results, returning what we have 12081 1726882399.38644: results queue empty 12081 1726882399.38645: checking for any_errors_fatal 12081 1726882399.38657: done checking for any_errors_fatal 12081 1726882399.38658: checking for max_fail_percentage 12081 1726882399.38661: done checking for max_fail_percentage 12081 1726882399.38662: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.38665: done checking to see if all hosts have failed 12081 1726882399.38666: getting the remaining hosts for this loop 
12081 1726882399.38668: done getting the remaining hosts for this loop 12081 1726882399.38673: getting the next task for host managed_node3 12081 1726882399.38682: done getting next task for host managed_node3 12081 1726882399.38685: ^ task is: TASK: Set NM profile exist flag based on the profile files 12081 1726882399.38691: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882399.38696: getting variables 12081 1726882399.38698: in VariableManager get_vars() 12081 1726882399.38732: Calling all_inventory to load vars for managed_node3 12081 1726882399.38735: Calling groups_inventory to load vars for managed_node3 12081 1726882399.38739: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.38756: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.38760: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.38769: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.40691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.42572: done with get_vars() 12081 1726882399.42616: done getting variables 12081 1726882399.42692: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:19 -0400 (0:00:00.401) 0:00:19.230 ****** 12081 1726882399.42741: entering _queue_task() for managed_node3/set_fact 12081 1726882399.43125: worker is 1 (out of 1 available) 12081 1726882399.43142: exiting _queue_task() for managed_node3/set_fact 12081 1726882399.43158: done queuing things up, now waiting for results queue to drain 12081 1726882399.43159: waiting for pending results... 
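
[editor's note] The skip decision that follows in this trace is driven by the `stat` result captured above (`profile_stat.stat.exists` evaluating to False). A minimal sketch of the stat → set_fact pattern visible here — the task names, module arguments, and the `profile_stat.stat.exists` conditional are taken from the log itself; the register name `profile_stat` is confirmed by the trace, but the fact name set and the exact playbook layout are assumptions:

```yaml
# Assumed playbook shape reconstructed from this trace; only the task
# names, stat module args, and the profile_stat.stat.exists conditional
# appear in the log itself.
- name: Stat profile file
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-bond0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_exists: true   # assumed fact name; not shown in the log
  when: profile_stat.stat.exists
```

Because the file does not exist on the managed node, the `when:` condition is False and the set_fact task is skipped, which is exactly the `skipping: [managed_node3]` result reported below.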
12081 1726882399.43483: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12081 1726882399.43633: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004fe 12081 1726882399.43655: variable 'ansible_search_path' from source: unknown 12081 1726882399.43667: variable 'ansible_search_path' from source: unknown 12081 1726882399.43722: calling self._execute() 12081 1726882399.43839: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.43853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.43871: variable 'omit' from source: magic vars 12081 1726882399.44293: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.44313: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.44468: variable 'profile_stat' from source: set_fact 12081 1726882399.44486: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882399.44494: when evaluation is False, skipping this task 12081 1726882399.44501: _execute() done 12081 1726882399.44508: dumping result to json 12081 1726882399.44515: done dumping result, returning 12081 1726882399.44525: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-0a3f-ff3c-0000000004fe] 12081 1726882399.44539: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fe 12081 1726882399.44673: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004fe 12081 1726882399.44681: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882399.44736: no more pending results, returning what we have 12081 1726882399.44741: results queue empty 12081 1726882399.44742: checking for any_errors_fatal 12081 1726882399.44755: done checking for any_errors_fatal 12081 1726882399.44756: 
checking for max_fail_percentage 12081 1726882399.44758: done checking for max_fail_percentage 12081 1726882399.44759: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.44761: done checking to see if all hosts have failed 12081 1726882399.44762: getting the remaining hosts for this loop 12081 1726882399.44765: done getting the remaining hosts for this loop 12081 1726882399.44770: getting the next task for host managed_node3 12081 1726882399.44780: done getting next task for host managed_node3 12081 1726882399.44783: ^ task is: TASK: Get NM profile info 12081 1726882399.44791: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12081 1726882399.44796: getting variables 12081 1726882399.44798: in VariableManager get_vars() 12081 1726882399.44833: Calling all_inventory to load vars for managed_node3 12081 1726882399.44836: Calling groups_inventory to load vars for managed_node3 12081 1726882399.44840: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.44859: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.44863: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.44869: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.46741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.49570: done with get_vars() 12081 1726882399.49603: done getting variables 12081 1726882399.49807: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:19 -0400 (0:00:00.071) 0:00:19.301 ****** 12081 1726882399.49844: entering _queue_task() for managed_node3/shell 12081 1726882399.50393: worker is 1 (out of 1 available) 12081 1726882399.50404: exiting _queue_task() for managed_node3/shell 12081 1726882399.50422: done queuing things up, now waiting for results queue to drain 12081 1726882399.50423: waiting for pending results... 
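
[editor's note] The task being queued here is a `shell` action (note the `Loading ActionModule 'shell'` line above) templated with the `profile` / `bond_port_profile` include params visible in the variable lookups that follow. The actual command line is not shown in this excerpt, so the `nmcli` invocation below is illustrative only — a plausible sketch of what a "Get NM profile info" shell task in a NetworkManager test could look like, not the playbook's confirmed content:

```yaml
# Illustrative sketch only: the real command executed by this task is not
# visible in this log excerpt. The task name and the use of the shell
# action module are taken from the trace.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"
  register: nm_profile_exists
  ignore_errors: true
```

The trace that follows shows the standard remote-execution sequence for such a task: `echo ~` to resolve the remote home directory, `mkdir` of a per-task temp directory under `~/.ansible/tmp`, SFTP transfer of the AnsiballZ-wrapped `command` module, `chmod u+x`, and finally execution with the remote Python interpreter.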
12081 1726882399.50817: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12081 1726882399.50883: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004ff 12081 1726882399.50896: variable 'ansible_search_path' from source: unknown 12081 1726882399.50899: variable 'ansible_search_path' from source: unknown 12081 1726882399.50938: calling self._execute() 12081 1726882399.51034: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.51039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.51049: variable 'omit' from source: magic vars 12081 1726882399.51445: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.51459: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.51467: variable 'omit' from source: magic vars 12081 1726882399.51535: variable 'omit' from source: magic vars 12081 1726882399.51642: variable 'profile' from source: include params 12081 1726882399.51645: variable 'bond_port_profile' from source: include params 12081 1726882399.51720: variable 'bond_port_profile' from source: include params 12081 1726882399.51787: variable 'omit' from source: magic vars 12081 1726882399.52116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882399.52168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882399.52197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882399.52214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.52226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.52257: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882399.52261: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.52269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.52377: Set connection var ansible_pipelining to False 12081 1726882399.52380: Set connection var ansible_shell_type to sh 12081 1726882399.52387: Set connection var ansible_shell_executable to /bin/sh 12081 1726882399.52390: Set connection var ansible_connection to ssh 12081 1726882399.52400: Set connection var ansible_timeout to 10 12081 1726882399.52405: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882399.52431: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.52434: variable 'ansible_connection' from source: unknown 12081 1726882399.52437: variable 'ansible_module_compression' from source: unknown 12081 1726882399.52440: variable 'ansible_shell_type' from source: unknown 12081 1726882399.52442: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.52444: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.52446: variable 'ansible_pipelining' from source: unknown 12081 1726882399.52449: variable 'ansible_timeout' from source: unknown 12081 1726882399.52457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.52593: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882399.52604: variable 'omit' from source: magic vars 12081 1726882399.52610: starting attempt loop 12081 1726882399.52617: running the handler 12081 1726882399.52629: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882399.52648: _low_level_execute_command(): starting 12081 1726882399.52656: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882399.53368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.53378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.53410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.53431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.53434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.53484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.53493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.53615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.55314: stdout chunk (state=3): >>>/root <<< 12081 1726882399.55683: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 12081 1726882399.55686: stderr chunk (state=3): >>><<< 12081 1726882399.55688: stdout chunk (state=3): >>><<< 12081 1726882399.55692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.55695: _low_level_execute_command(): starting 12081 1726882399.55698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645 `" && echo ansible-tmp-1726882399.555561-12928-7333524666645="` echo /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645 `" ) && sleep 0' 12081 1726882399.56274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882399.56286: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882399.56291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.56308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.56356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.56359: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882399.56370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.56386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882399.56394: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882399.56400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882399.56408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.56417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.56436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.56443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882399.56454: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882399.56460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.56536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.56562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.56578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.56708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12081 1726882399.58608: stdout chunk (state=3): >>>ansible-tmp-1726882399.555561-12928-7333524666645=/root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645 <<< 12081 1726882399.58711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.58775: stderr chunk (state=3): >>><<< 12081 1726882399.58779: stdout chunk (state=3): >>><<< 12081 1726882399.58797: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882399.555561-12928-7333524666645=/root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.58826: variable 'ansible_module_compression' from source: unknown 12081 1726882399.58872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882399.58906: variable 
'ansible_facts' from source: unknown 12081 1726882399.58980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/AnsiballZ_command.py 12081 1726882399.59110: Sending initial data 12081 1726882399.59114: Sent initial data (153 bytes) 12081 1726882399.60198: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.60296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.62023: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882399.62114: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882399.62211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp7s570m88 /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/AnsiballZ_command.py <<< 12081 1726882399.62308: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882399.63400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.63492: stderr chunk (state=3): >>><<< 12081 1726882399.63496: stdout chunk (state=3): >>><<< 12081 1726882399.63512: done transferring module to remote 12081 1726882399.63523: _low_level_execute_command(): starting 12081 1726882399.63526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/ /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/AnsiballZ_command.py && sleep 0' 12081 1726882399.63959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.63967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.64013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.64016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.64022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.64083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.64087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.64193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.65922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.65973: stderr chunk (state=3): >>><<< 12081 1726882399.65976: stdout chunk (state=3): >>><<< 12081 1726882399.65990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.65993: _low_level_execute_command(): starting 12081 1726882399.65998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/AnsiballZ_command.py && sleep 0' 12081 1726882399.66426: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.66433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.66471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.66477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.66485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.66495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.66500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.66550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.66568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.66687: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.81953: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:19.795559", "end": "2024-09-20 21:33:19.818067", "delta": "0:00:00.022508", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882399.83152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882399.83208: stderr chunk (state=3): >>><<< 12081 1726882399.83212: stdout chunk (state=3): >>><<< 12081 1726882399.83229: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:19.795559", "end": "2024-09-20 21:33:19.818067", "delta": "0:00:00.022508", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882399.83260: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882399.83268: _low_level_execute_command(): starting 12081 1726882399.83276: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882399.555561-12928-7333524666645/ > /dev/null 2>&1 && sleep 0' 12081 1726882399.83743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882399.83747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882399.83785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882399.83788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882399.83791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882399.83846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882399.83849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882399.83858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882399.83961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882399.85772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882399.85827: stderr chunk (state=3): >>><<< 12081 1726882399.85831: stdout chunk (state=3): >>><<< 12081 1726882399.85846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882399.85855: handler run complete 12081 1726882399.85877: Evaluated conditional (False): False 12081 1726882399.85885: attempt loop complete, returning result 12081 1726882399.85888: _execute() done 12081 1726882399.85890: dumping result to json 12081 1726882399.85897: done dumping result, returning 12081 1726882399.85904: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-0a3f-ff3c-0000000004ff] 12081 1726882399.85913: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004ff 12081 1726882399.86014: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004ff 12081 1726882399.86017: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.022508", "end": "2024-09-20 21:33:19.818067", "rc": 0, "start": "2024-09-20 21:33:19.795559" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 12081 1726882399.86093: no more pending results, returning what we have 12081 1726882399.86097: results queue empty 12081 1726882399.86098: checking for any_errors_fatal 12081 1726882399.86104: done checking for any_errors_fatal 12081 1726882399.86105: checking for max_fail_percentage 12081 1726882399.86107: done checking for max_fail_percentage 12081 1726882399.86108: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.86109: done checking to see if all hosts have failed 12081 1726882399.86110: getting the remaining hosts for this loop 12081 1726882399.86112: done 
getting the remaining hosts for this loop 12081 1726882399.86115: getting the next task for host managed_node3 12081 1726882399.86123: done getting next task for host managed_node3 12081 1726882399.86127: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882399.86133: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882399.86137: getting variables 12081 1726882399.86138: in VariableManager get_vars() 12081 1726882399.86171: Calling all_inventory to load vars for managed_node3 12081 1726882399.86174: Calling groups_inventory to load vars for managed_node3 12081 1726882399.86177: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.86188: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.86190: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.86193: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.87169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.88112: done with get_vars() 12081 1726882399.88131: done getting variables 12081 1726882399.88180: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:19 -0400 (0:00:00.383) 0:00:19.685 ****** 12081 1726882399.88211: entering _queue_task() for managed_node3/set_fact 12081 1726882399.88449: worker is 1 (out of 1 available) 12081 1726882399.88466: exiting _queue_task() for managed_node3/set_fact 12081 1726882399.88480: done queuing things up, now waiting for results queue to drain 12081 1726882399.88481: waiting for pending results... 
12081 1726882399.88662: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882399.88754: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000500 12081 1726882399.88765: variable 'ansible_search_path' from source: unknown 12081 1726882399.88768: variable 'ansible_search_path' from source: unknown 12081 1726882399.88799: calling self._execute() 12081 1726882399.88869: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.88873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.88882: variable 'omit' from source: magic vars 12081 1726882399.89149: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.89161: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.89253: variable 'nm_profile_exists' from source: set_fact 12081 1726882399.89262: Evaluated conditional (nm_profile_exists.rc == 0): True 12081 1726882399.89269: variable 'omit' from source: magic vars 12081 1726882399.89311: variable 'omit' from source: magic vars 12081 1726882399.89333: variable 'omit' from source: magic vars 12081 1726882399.89370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882399.89396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882399.89416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882399.89427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.89437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882399.89464: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12081 1726882399.89468: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.89470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.89543: Set connection var ansible_pipelining to False 12081 1726882399.89547: Set connection var ansible_shell_type to sh 12081 1726882399.89555: Set connection var ansible_shell_executable to /bin/sh 12081 1726882399.89558: Set connection var ansible_connection to ssh 12081 1726882399.89560: Set connection var ansible_timeout to 10 12081 1726882399.89566: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882399.89585: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.89588: variable 'ansible_connection' from source: unknown 12081 1726882399.89591: variable 'ansible_module_compression' from source: unknown 12081 1726882399.89593: variable 'ansible_shell_type' from source: unknown 12081 1726882399.89595: variable 'ansible_shell_executable' from source: unknown 12081 1726882399.89598: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.89602: variable 'ansible_pipelining' from source: unknown 12081 1726882399.89604: variable 'ansible_timeout' from source: unknown 12081 1726882399.89608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.89709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882399.89717: variable 'omit' from source: magic vars 12081 1726882399.89721: starting attempt loop 12081 1726882399.89726: running the handler 12081 1726882399.89740: handler run complete 12081 1726882399.89747: attempt loop complete, returning result 12081 1726882399.89750: _execute() done 
12081 1726882399.89755: dumping result to json 12081 1726882399.89757: done dumping result, returning 12081 1726882399.89763: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-0a3f-ff3c-000000000500] 12081 1726882399.89771: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000500 12081 1726882399.89862: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000500 12081 1726882399.89868: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12081 1726882399.89918: no more pending results, returning what we have 12081 1726882399.89921: results queue empty 12081 1726882399.89922: checking for any_errors_fatal 12081 1726882399.89931: done checking for any_errors_fatal 12081 1726882399.89932: checking for max_fail_percentage 12081 1726882399.89934: done checking for max_fail_percentage 12081 1726882399.89935: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.89936: done checking to see if all hosts have failed 12081 1726882399.89936: getting the remaining hosts for this loop 12081 1726882399.89938: done getting the remaining hosts for this loop 12081 1726882399.89942: getting the next task for host managed_node3 12081 1726882399.89961: done getting next task for host managed_node3 12081 1726882399.89964: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12081 1726882399.89973: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882399.89977: getting variables 12081 1726882399.89978: in VariableManager get_vars() 12081 1726882399.90006: Calling all_inventory to load vars for managed_node3 12081 1726882399.90008: Calling groups_inventory to load vars for managed_node3 12081 1726882399.90011: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.90020: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.90023: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.90026: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.90889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.91996: done with get_vars() 12081 1726882399.92026: done getting variables 12081 1726882399.92092: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882399.92210: variable 'profile' from source: include params 12081 1726882399.92214: variable 'bond_port_profile' from source: include params 12081 1726882399.92277: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:19 -0400 (0:00:00.040) 0:00:19.726 ****** 12081 1726882399.92311: entering _queue_task() for managed_node3/command 12081 1726882399.92647: worker is 1 (out of 1 available) 12081 1726882399.92665: exiting _queue_task() for managed_node3/command 12081 1726882399.92677: done queuing things up, now waiting for results queue to drain 12081 1726882399.92679: waiting for pending 
results... 12081 1726882399.92960: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 12081 1726882399.93102: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000502 12081 1726882399.93126: variable 'ansible_search_path' from source: unknown 12081 1726882399.93134: variable 'ansible_search_path' from source: unknown 12081 1726882399.93181: calling self._execute() 12081 1726882399.93282: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.93293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.93307: variable 'omit' from source: magic vars 12081 1726882399.93609: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.93620: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.93709: variable 'profile_stat' from source: set_fact 12081 1726882399.93718: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882399.93721: when evaluation is False, skipping this task 12081 1726882399.93724: _execute() done 12081 1726882399.93726: dumping result to json 12081 1726882399.93730: done dumping result, returning 12081 1726882399.93736: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-0a3f-ff3c-000000000502] 12081 1726882399.93742: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000502 12081 1726882399.93828: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000502 12081 1726882399.93831: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882399.93922: no more pending results, returning what we have 12081 1726882399.93926: results queue empty 12081 1726882399.93927: checking for any_errors_fatal 12081 1726882399.93933: done checking for any_errors_fatal 12081 1726882399.93934: 
checking for max_fail_percentage 12081 1726882399.93936: done checking for max_fail_percentage 12081 1726882399.93937: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.93938: done checking to see if all hosts have failed 12081 1726882399.93940: getting the remaining hosts for this loop 12081 1726882399.93941: done getting the remaining hosts for this loop 12081 1726882399.93945: getting the next task for host managed_node3 12081 1726882399.93952: done getting next task for host managed_node3 12081 1726882399.93954: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12081 1726882399.93959: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882399.93962: getting variables 12081 1726882399.93964: in VariableManager get_vars() 12081 1726882399.93990: Calling all_inventory to load vars for managed_node3 12081 1726882399.93992: Calling groups_inventory to load vars for managed_node3 12081 1726882399.93995: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.94005: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.94007: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.94010: Calling groups_plugins_play to load vars for managed_node3 12081 1726882399.94797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882399.96193: done with get_vars() 12081 1726882399.96223: done getting variables 12081 1726882399.96290: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882399.96410: variable 'profile' from source: include params 12081 1726882399.96414: variable 'bond_port_profile' from source: include params 12081 1726882399.96483: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:19 -0400 (0:00:00.042) 0:00:19.768 ****** 12081 1726882399.96516: entering _queue_task() for managed_node3/set_fact 12081 1726882399.96869: worker is 1 (out of 1 available) 12081 1726882399.96882: exiting _queue_task() for managed_node3/set_fact 12081 1726882399.96894: 
done queuing things up, now waiting for results queue to drain 12081 1726882399.96895: waiting for pending results... 12081 1726882399.97193: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 12081 1726882399.97326: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000503 12081 1726882399.97342: variable 'ansible_search_path' from source: unknown 12081 1726882399.97346: variable 'ansible_search_path' from source: unknown 12081 1726882399.97386: calling self._execute() 12081 1726882399.97486: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882399.97490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882399.97500: variable 'omit' from source: magic vars 12081 1726882399.97883: variable 'ansible_distribution_major_version' from source: facts 12081 1726882399.97896: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882399.98030: variable 'profile_stat' from source: set_fact 12081 1726882399.98041: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882399.98044: when evaluation is False, skipping this task 12081 1726882399.98047: _execute() done 12081 1726882399.98049: dumping result to json 12081 1726882399.98052: done dumping result, returning 12081 1726882399.98067: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-0a3f-ff3c-000000000503] 12081 1726882399.98074: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000503 12081 1726882399.98170: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000503 12081 1726882399.98175: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882399.98246: no more pending results, returning what we have 12081 1726882399.98254: results queue empty 12081 1726882399.98255: 
checking for any_errors_fatal 12081 1726882399.98261: done checking for any_errors_fatal 12081 1726882399.98262: checking for max_fail_percentage 12081 1726882399.98265: done checking for max_fail_percentage 12081 1726882399.98266: checking to see if all hosts have failed and the running result is not ok 12081 1726882399.98268: done checking to see if all hosts have failed 12081 1726882399.98268: getting the remaining hosts for this loop 12081 1726882399.98270: done getting the remaining hosts for this loop 12081 1726882399.98276: getting the next task for host managed_node3 12081 1726882399.98285: done getting next task for host managed_node3 12081 1726882399.98288: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12081 1726882399.98295: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882399.98299: getting variables 12081 1726882399.98301: in VariableManager get_vars() 12081 1726882399.98334: Calling all_inventory to load vars for managed_node3 12081 1726882399.98337: Calling groups_inventory to load vars for managed_node3 12081 1726882399.98341: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882399.98358: Calling all_plugins_play to load vars for managed_node3 12081 1726882399.98361: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882399.98366: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.00246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.02042: done with get_vars() 12081 1726882400.02086: done getting variables 12081 1726882400.02148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882400.02281: variable 'profile' from source: include params 12081 1726882400.02285: variable 'bond_port_profile' from source: include params 12081 1726882400.02346: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:20 -0400 (0:00:00.058) 0:00:19.826 ****** 12081 1726882400.02389: entering _queue_task() for managed_node3/command 12081 1726882400.02744: worker is 1 (out of 1 available) 
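
[editor's note] The skipped tasks traced here ("Verify the ansible_managed comment in ifcfg-bond0", "Get the fingerprint comment in ifcfg-bond0") all show the same pair of conditionals: `ansible_distribution_major_version != '6'` evaluates True, then `profile_stat.stat.exists` evaluates False and the task is skipped. A minimal sketch of such a stat-guarded task, as it might appear in `get_profile_stat.yml` (illustrative only — the task names and the `profile_stat` / `profile` variables are taken from the log, but the grep pattern and file path are assumptions, not the collection's actual task body):

```yaml
# Hypothetical sketch of a stat-guarded task, modeled on the log above.
# profile_stat is assumed to come from an earlier ansible.builtin.stat
# task registered against the ifcfg file; when the file is absent,
# profile_stat.stat.exists is false and the task is skipped with
# skip_reason "Conditional result was False".
- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```

When the second condition is false, the executor short-circuits before queuing any module work, which is why the log shows `_execute() done` immediately after the conditional evaluation.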
12081 1726882400.02759: exiting _queue_task() for managed_node3/command 12081 1726882400.02774: done queuing things up, now waiting for results queue to drain 12081 1726882400.02775: waiting for pending results... 12081 1726882400.03065: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 12081 1726882400.03196: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000504 12081 1726882400.03209: variable 'ansible_search_path' from source: unknown 12081 1726882400.03214: variable 'ansible_search_path' from source: unknown 12081 1726882400.03258: calling self._execute() 12081 1726882400.03347: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.03358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.03369: variable 'omit' from source: magic vars 12081 1726882400.03739: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.03752: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.03889: variable 'profile_stat' from source: set_fact 12081 1726882400.03905: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882400.03908: when evaluation is False, skipping this task 12081 1726882400.03911: _execute() done 12081 1726882400.03913: dumping result to json 12081 1726882400.03916: done dumping result, returning 12081 1726882400.03922: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-0a3f-ff3c-000000000504] 12081 1726882400.03930: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000504 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882400.04080: no more pending results, returning what we have 12081 1726882400.04085: results queue empty 12081 1726882400.04086: checking for any_errors_fatal 12081 1726882400.04093: done 
checking for any_errors_fatal 12081 1726882400.04094: checking for max_fail_percentage 12081 1726882400.04096: done checking for max_fail_percentage 12081 1726882400.04097: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.04098: done checking to see if all hosts have failed 12081 1726882400.04099: getting the remaining hosts for this loop 12081 1726882400.04101: done getting the remaining hosts for this loop 12081 1726882400.04105: getting the next task for host managed_node3 12081 1726882400.04117: done getting next task for host managed_node3 12081 1726882400.04120: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12081 1726882400.04127: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.04131: getting variables 12081 1726882400.04132: in VariableManager get_vars() 12081 1726882400.04171: Calling all_inventory to load vars for managed_node3 12081 1726882400.04174: Calling groups_inventory to load vars for managed_node3 12081 1726882400.04179: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.04195: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.04199: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.04203: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.04800: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000504 12081 1726882400.04804: WORKER PROCESS EXITING 12081 1726882400.05968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.07712: done with get_vars() 12081 1726882400.07747: done getting variables 12081 1726882400.07811: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882400.07934: variable 'profile' from source: include params 12081 1726882400.07939: variable 'bond_port_profile' from source: include params 12081 1726882400.08005: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:20 -0400 (0:00:00.056) 0:00:19.883 ****** 12081 1726882400.08039: entering 
_queue_task() for managed_node3/set_fact 12081 1726882400.08388: worker is 1 (out of 1 available) 12081 1726882400.08400: exiting _queue_task() for managed_node3/set_fact 12081 1726882400.08412: done queuing things up, now waiting for results queue to drain 12081 1726882400.08414: waiting for pending results... 12081 1726882400.08704: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 12081 1726882400.08870: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000505 12081 1726882400.08874: variable 'ansible_search_path' from source: unknown 12081 1726882400.08878: variable 'ansible_search_path' from source: unknown 12081 1726882400.08977: calling self._execute() 12081 1726882400.09121: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.09126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.09128: variable 'omit' from source: magic vars 12081 1726882400.09378: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.09391: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.09560: variable 'profile_stat' from source: set_fact 12081 1726882400.09574: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882400.09578: when evaluation is False, skipping this task 12081 1726882400.09581: _execute() done 12081 1726882400.09583: dumping result to json 12081 1726882400.09586: done dumping result, returning 12081 1726882400.09594: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-0a3f-ff3c-000000000505] 12081 1726882400.09601: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000505 12081 1726882400.09696: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000505 12081 1726882400.09699: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", 
"skip_reason": "Conditional result was False" } 12081 1726882400.09775: no more pending results, returning what we have 12081 1726882400.09779: results queue empty 12081 1726882400.09780: checking for any_errors_fatal 12081 1726882400.09788: done checking for any_errors_fatal 12081 1726882400.09789: checking for max_fail_percentage 12081 1726882400.09791: done checking for max_fail_percentage 12081 1726882400.09792: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.09793: done checking to see if all hosts have failed 12081 1726882400.09794: getting the remaining hosts for this loop 12081 1726882400.09796: done getting the remaining hosts for this loop 12081 1726882400.09802: getting the next task for host managed_node3 12081 1726882400.09815: done getting next task for host managed_node3 12081 1726882400.09819: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12081 1726882400.09825: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.09831: getting variables 12081 1726882400.09833: in VariableManager get_vars() 12081 1726882400.09872: Calling all_inventory to load vars for managed_node3 12081 1726882400.09875: Calling groups_inventory to load vars for managed_node3 12081 1726882400.09879: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.09893: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.09896: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.09899: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.11872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.14083: done with get_vars() 12081 1726882400.14115: done getting variables 12081 1726882400.14181: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882400.14298: variable 'profile' from source: include params 12081 1726882400.14302: variable 'bond_port_profile' from source: include params 12081 1726882400.14362: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:20 -0400 (0:00:00.064) 0:00:19.948 ****** 12081 1726882400.14510: entering _queue_task() for managed_node3/assert 12081 1726882400.15187: worker is 1 (out of 1 available) 12081 1726882400.15200: exiting _queue_task() for managed_node3/assert 12081 1726882400.15211: done 
queuing things up, now waiting for results queue to drain 12081 1726882400.15213: waiting for pending results... 12081 1726882400.16048: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 12081 1726882400.16603: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004da 12081 1726882400.16624: variable 'ansible_search_path' from source: unknown 12081 1726882400.16631: variable 'ansible_search_path' from source: unknown 12081 1726882400.16677: calling self._execute() 12081 1726882400.16774: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.16785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.16798: variable 'omit' from source: magic vars 12081 1726882400.17304: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.17327: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.17335: variable 'omit' from source: magic vars 12081 1726882400.17404: variable 'omit' from source: magic vars 12081 1726882400.17522: variable 'profile' from source: include params 12081 1726882400.17526: variable 'bond_port_profile' from source: include params 12081 1726882400.17610: variable 'bond_port_profile' from source: include params 12081 1726882400.17630: variable 'omit' from source: magic vars 12081 1726882400.17690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882400.17720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882400.17739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882400.17768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.17780: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.17808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882400.17813: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.17815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.17923: Set connection var ansible_pipelining to False 12081 1726882400.17926: Set connection var ansible_shell_type to sh 12081 1726882400.17934: Set connection var ansible_shell_executable to /bin/sh 12081 1726882400.17936: Set connection var ansible_connection to ssh 12081 1726882400.17942: Set connection var ansible_timeout to 10 12081 1726882400.17947: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882400.17982: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.17985: variable 'ansible_connection' from source: unknown 12081 1726882400.17988: variable 'ansible_module_compression' from source: unknown 12081 1726882400.17990: variable 'ansible_shell_type' from source: unknown 12081 1726882400.17992: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.17994: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.17999: variable 'ansible_pipelining' from source: unknown 12081 1726882400.18001: variable 'ansible_timeout' from source: unknown 12081 1726882400.18005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.18161: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882400.18173: variable 'omit' from source: magic vars 12081 1726882400.18179: starting 
attempt loop 12081 1726882400.18182: running the handler 12081 1726882400.18294: variable 'lsr_net_profile_exists' from source: set_fact 12081 1726882400.18302: Evaluated conditional (lsr_net_profile_exists): True 12081 1726882400.18313: handler run complete 12081 1726882400.18327: attempt loop complete, returning result 12081 1726882400.18329: _execute() done 12081 1726882400.18332: dumping result to json 12081 1726882400.18335: done dumping result, returning 12081 1726882400.18341: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0e448fcc-3ce9-0a3f-ff3c-0000000004da] 12081 1726882400.18348: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004da ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882400.18486: no more pending results, returning what we have 12081 1726882400.18490: results queue empty 12081 1726882400.18490: checking for any_errors_fatal 12081 1726882400.18496: done checking for any_errors_fatal 12081 1726882400.18496: checking for max_fail_percentage 12081 1726882400.18498: done checking for max_fail_percentage 12081 1726882400.18499: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.18500: done checking to see if all hosts have failed 12081 1726882400.18501: getting the remaining hosts for this loop 12081 1726882400.18503: done getting the remaining hosts for this loop 12081 1726882400.18509: getting the next task for host managed_node3 12081 1726882400.18516: done getting next task for host managed_node3 12081 1726882400.18518: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12081 1726882400.18523: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.18526: getting variables 12081 1726882400.18528: in VariableManager get_vars() 12081 1726882400.18562: Calling all_inventory to load vars for managed_node3 12081 1726882400.18566: Calling groups_inventory to load vars for managed_node3 12081 1726882400.18570: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.18583: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.18586: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.18589: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.19238: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004da 12081 1726882400.19242: WORKER PROCESS EXITING 12081 1726882400.21110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.23176: done with get_vars() 12081 1726882400.23199: done getting variables 12081 1726882400.23279: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882400.23504: variable 'profile' from source: include params 12081 1726882400.23508: variable 'bond_port_profile' from source: include params 12081 1726882400.23577: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:20 -0400 (0:00:00.090) 0:00:20.039 ****** 12081 1726882400.23609: entering _queue_task() for managed_node3/assert 12081 1726882400.23947: worker is 1 (out of 1 available) 12081 1726882400.23962: exiting _queue_task() for managed_node3/assert 12081 1726882400.23975: done queuing things up, now waiting for results queue to drain 12081 1726882400.23976: waiting for pending results... 
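
[editor's note] The three assert tasks traced in this section ("profile is present", "ansible managed comment is present", "fingerprint comment is present") each evaluate a single boolean fact set earlier via `set_fact` (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`) and return "All assertions passed". A sketch of the shape such a task could take in `assert_profile_present.yml` (an assumption based on the log's conditional trace, not the file's verbatim contents):

```yaml
# Hypothetical sketch of one of the assert tasks traced above.
# lsr_net_profile_exists is a fact set by an earlier set_fact task;
# the distribution guard matches the conditional shown in the log.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
  when: ansible_distribution_major_version != '6'
```

Because `assert` is a pure action plugin, the log shows "running the handler" and "handler run complete" with no remote module invocation in between.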
12081 1726882400.24929: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 12081 1726882400.25075: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004db 12081 1726882400.25184: variable 'ansible_search_path' from source: unknown 12081 1726882400.25195: variable 'ansible_search_path' from source: unknown 12081 1726882400.25238: calling self._execute() 12081 1726882400.25352: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.25365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.25381: variable 'omit' from source: magic vars 12081 1726882400.25747: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.25771: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.25783: variable 'omit' from source: magic vars 12081 1726882400.25839: variable 'omit' from source: magic vars 12081 1726882400.25947: variable 'profile' from source: include params 12081 1726882400.25958: variable 'bond_port_profile' from source: include params 12081 1726882400.26029: variable 'bond_port_profile' from source: include params 12081 1726882400.26059: variable 'omit' from source: magic vars 12081 1726882400.26104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882400.26142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882400.26171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882400.26195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.26212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.26248: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882400.26257: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.26268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.26378: Set connection var ansible_pipelining to False 12081 1726882400.26388: Set connection var ansible_shell_type to sh 12081 1726882400.26401: Set connection var ansible_shell_executable to /bin/sh 12081 1726882400.26409: Set connection var ansible_connection to ssh 12081 1726882400.26418: Set connection var ansible_timeout to 10 12081 1726882400.26428: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882400.26456: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.26466: variable 'ansible_connection' from source: unknown 12081 1726882400.26474: variable 'ansible_module_compression' from source: unknown 12081 1726882400.26481: variable 'ansible_shell_type' from source: unknown 12081 1726882400.26487: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.26495: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.26502: variable 'ansible_pipelining' from source: unknown 12081 1726882400.26509: variable 'ansible_timeout' from source: unknown 12081 1726882400.26516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.26656: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882400.26677: variable 'omit' from source: magic vars 12081 1726882400.26688: starting attempt loop 12081 1726882400.26696: running the handler 12081 1726882400.26804: variable 'lsr_net_profile_ansible_managed' from source: set_fact 
12081 1726882400.26815: Evaluated conditional (lsr_net_profile_ansible_managed): True 12081 1726882400.26825: handler run complete 12081 1726882400.26844: attempt loop complete, returning result 12081 1726882400.26850: _execute() done 12081 1726882400.26856: dumping result to json 12081 1726882400.26863: done dumping result, returning 12081 1726882400.26876: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0e448fcc-3ce9-0a3f-ff3c-0000000004db] 12081 1726882400.26887: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004db 12081 1726882400.27023: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004db 12081 1726882400.27030: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882400.27314: no more pending results, returning what we have 12081 1726882400.27317: results queue empty 12081 1726882400.27318: checking for any_errors_fatal 12081 1726882400.27323: done checking for any_errors_fatal 12081 1726882400.27323: checking for max_fail_percentage 12081 1726882400.27325: done checking for max_fail_percentage 12081 1726882400.27326: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.27327: done checking to see if all hosts have failed 12081 1726882400.27328: getting the remaining hosts for this loop 12081 1726882400.27329: done getting the remaining hosts for this loop 12081 1726882400.27332: getting the next task for host managed_node3 12081 1726882400.27339: done getting next task for host managed_node3 12081 1726882400.27341: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12081 1726882400.27345: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.27348: getting variables 12081 1726882400.27350: in VariableManager get_vars() 12081 1726882400.27385: Calling all_inventory to load vars for managed_node3 12081 1726882400.27388: Calling groups_inventory to load vars for managed_node3 12081 1726882400.27392: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.27401: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.27404: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.27406: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.34546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.36568: done with get_vars() 12081 1726882400.36596: done getting variables 12081 1726882400.36644: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882400.36750: variable 'profile' from source: include params 12081 1726882400.36753: variable 'bond_port_profile' from source: include params 12081 1726882400.36814: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:20 -0400 (0:00:00.132) 0:00:20.171 ****** 12081 1726882400.36843: entering _queue_task() for managed_node3/assert 12081 1726882400.37173: worker is 1 (out of 1 available) 12081 1726882400.37185: exiting _queue_task() for managed_node3/assert 12081 1726882400.37198: done queuing things up, now waiting for results queue to drain 12081 1726882400.37201: waiting for pending results... 
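
[editor's note] Each task trace repeats the same connection-variable setup ("Set connection var ansible_connection to ssh", "ansible_shell_type to sh", "ansible_timeout to 10", "ansible_pipelining to False", "ansible_module_compression to ZIP_DEFLATED"). These are resolved per task from configuration defaults and host vars; the same values could be pinned explicitly in inventory. A sketch (host name and `ansible_host` value are placeholders, not taken from this run's inventory):

```yaml
# Hypothetical inventory snippet pinning the connection vars that the
# log shows being resolved for every task on managed_node3.
all:
  hosts:
    managed_node3:
      ansible_host: 10.0.0.3        # placeholder address
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_timeout: 10
      ansible_pipelining: false
```

In this run the log marks most of these as "from source: unknown", i.e. they fell back to built-in defaults rather than inventory or play vars.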
12081 1726882400.37488: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 12081 1726882400.37633: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004dc 12081 1726882400.37658: variable 'ansible_search_path' from source: unknown 12081 1726882400.37670: variable 'ansible_search_path' from source: unknown 12081 1726882400.37714: calling self._execute() 12081 1726882400.37815: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.37827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.37841: variable 'omit' from source: magic vars 12081 1726882400.38209: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.38229: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.38241: variable 'omit' from source: magic vars 12081 1726882400.38306: variable 'omit' from source: magic vars 12081 1726882400.38412: variable 'profile' from source: include params 12081 1726882400.38423: variable 'bond_port_profile' from source: include params 12081 1726882400.38490: variable 'bond_port_profile' from source: include params 12081 1726882400.38518: variable 'omit' from source: magic vars 12081 1726882400.38566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882400.38605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882400.38635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882400.38659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.38681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.38717: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882400.38728: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.38739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.38849: Set connection var ansible_pipelining to False 12081 1726882400.38857: Set connection var ansible_shell_type to sh 12081 1726882400.38872: Set connection var ansible_shell_executable to /bin/sh 12081 1726882400.38882: Set connection var ansible_connection to ssh 12081 1726882400.38893: Set connection var ansible_timeout to 10 12081 1726882400.38902: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882400.38931: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.38939: variable 'ansible_connection' from source: unknown 12081 1726882400.38948: variable 'ansible_module_compression' from source: unknown 12081 1726882400.38958: variable 'ansible_shell_type' from source: unknown 12081 1726882400.38966: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.38973: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.38981: variable 'ansible_pipelining' from source: unknown 12081 1726882400.38988: variable 'ansible_timeout' from source: unknown 12081 1726882400.38996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.39141: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882400.39159: variable 'omit' from source: magic vars 12081 1726882400.39173: starting attempt loop 12081 1726882400.39180: running the handler 12081 1726882400.39296: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12081 1726882400.39306: Evaluated conditional (lsr_net_profile_fingerprint): True 12081 1726882400.39317: handler run complete 12081 1726882400.39337: attempt loop complete, returning result 12081 1726882400.39343: _execute() done 12081 1726882400.39349: dumping result to json 12081 1726882400.39356: done dumping result, returning 12081 1726882400.39369: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0e448fcc-3ce9-0a3f-ff3c-0000000004dc] 12081 1726882400.39380: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004dc ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882400.39544: no more pending results, returning what we have 12081 1726882400.39548: results queue empty 12081 1726882400.39549: checking for any_errors_fatal 12081 1726882400.39558: done checking for any_errors_fatal 12081 1726882400.39559: checking for max_fail_percentage 12081 1726882400.39561: done checking for max_fail_percentage 12081 1726882400.39562: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.39564: done checking to see if all hosts have failed 12081 1726882400.39565: getting the remaining hosts for this loop 12081 1726882400.39567: done getting the remaining hosts for this loop 12081 1726882400.39571: getting the next task for host managed_node3 12081 1726882400.39583: done getting next task for host managed_node3 12081 1726882400.39585: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12081 1726882400.39591: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.39597: getting variables 12081 1726882400.39599: in VariableManager get_vars() 12081 1726882400.39633: Calling all_inventory to load vars for managed_node3 12081 1726882400.39636: Calling groups_inventory to load vars for managed_node3 12081 1726882400.39640: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.39653: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.39657: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.39660: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.40682: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004dc 12081 1726882400.40686: WORKER PROCESS EXITING 12081 1726882400.41498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.43233: done with get_vars() 12081 1726882400.43261: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:20 -0400 (0:00:00.065) 
0:00:20.236 ****** 12081 1726882400.43373: entering _queue_task() for managed_node3/include_tasks 12081 1726882400.43704: worker is 1 (out of 1 available) 12081 1726882400.43717: exiting _queue_task() for managed_node3/include_tasks 12081 1726882400.43729: done queuing things up, now waiting for results queue to drain 12081 1726882400.43730: waiting for pending results... 12081 1726882400.44023: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12081 1726882400.44171: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e0 12081 1726882400.44192: variable 'ansible_search_path' from source: unknown 12081 1726882400.44200: variable 'ansible_search_path' from source: unknown 12081 1726882400.44239: calling self._execute() 12081 1726882400.44344: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.44355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.44372: variable 'omit' from source: magic vars 12081 1726882400.44751: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.44771: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.44781: _execute() done 12081 1726882400.44788: dumping result to json 12081 1726882400.44794: done dumping result, returning 12081 1726882400.44802: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000004e0] 12081 1726882400.44813: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e0 12081 1726882400.44955: no more pending results, returning what we have 12081 1726882400.44961: in VariableManager get_vars() 12081 1726882400.45002: Calling all_inventory to load vars for managed_node3 12081 1726882400.45005: Calling groups_inventory to load vars for managed_node3 12081 1726882400.45009: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.45025: Calling 
all_plugins_play to load vars for managed_node3 12081 1726882400.45028: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.45030: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.46082: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e0 12081 1726882400.46085: WORKER PROCESS EXITING 12081 1726882400.46707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.48258: done with get_vars() 12081 1726882400.48275: variable 'ansible_search_path' from source: unknown 12081 1726882400.48276: variable 'ansible_search_path' from source: unknown 12081 1726882400.48303: we have included files to process 12081 1726882400.48303: generating all_blocks data 12081 1726882400.48305: done generating all_blocks data 12081 1726882400.48310: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882400.48310: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882400.48312: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882400.48925: done processing included file 12081 1726882400.48926: iterating over new_blocks loaded from include file 12081 1726882400.48927: in VariableManager get_vars() 12081 1726882400.48939: done with get_vars() 12081 1726882400.48940: filtering new block on tags 12081 1726882400.48993: done filtering new block on tags 12081 1726882400.48995: in VariableManager get_vars() 12081 1726882400.49005: done with get_vars() 12081 1726882400.49006: filtering new block on tags 12081 1726882400.49041: done filtering new block on tags 12081 1726882400.49042: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12081 1726882400.49046: extending task lists for all hosts with included blocks 12081 1726882400.49334: done extending task lists 12081 1726882400.49335: done processing included files 12081 1726882400.49335: results queue empty 12081 1726882400.49336: checking for any_errors_fatal 12081 1726882400.49339: done checking for any_errors_fatal 12081 1726882400.49340: checking for max_fail_percentage 12081 1726882400.49340: done checking for max_fail_percentage 12081 1726882400.49341: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.49341: done checking to see if all hosts have failed 12081 1726882400.49342: getting the remaining hosts for this loop 12081 1726882400.49343: done getting the remaining hosts for this loop 12081 1726882400.49345: getting the next task for host managed_node3 12081 1726882400.49348: done getting next task for host managed_node3 12081 1726882400.49349: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882400.49354: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882400.49356: getting variables 12081 1726882400.49356: in VariableManager get_vars() 12081 1726882400.49362: Calling all_inventory to load vars for managed_node3 12081 1726882400.49365: Calling groups_inventory to load vars for managed_node3 12081 1726882400.49367: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.49371: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.49372: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.49374: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.50239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.52230: done with get_vars() 12081 1726882400.52256: done getting variables 12081 1726882400.52302: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:20 -0400 (0:00:00.089) 0:00:20.326 ****** 12081 1726882400.52346: entering _queue_task() for managed_node3/set_fact 12081 1726882400.52819: worker is 1 (out of 1 available) 12081 1726882400.52830: exiting _queue_task() for managed_node3/set_fact 12081 1726882400.52846: done queuing things up, now waiting for results queue to drain 12081 1726882400.52847: waiting for pending results... 12081 1726882400.53230: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882400.53379: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000558 12081 1726882400.53405: variable 'ansible_search_path' from source: unknown 12081 1726882400.53413: variable 'ansible_search_path' from source: unknown 12081 1726882400.53466: calling self._execute() 12081 1726882400.53578: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.53590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.53604: variable 'omit' from source: magic vars 12081 1726882400.54029: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.54055: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.54069: variable 'omit' from source: magic vars 12081 1726882400.54133: variable 'omit' from source: magic vars 12081 1726882400.54180: variable 'omit' from source: magic vars 12081 1726882400.54232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882400.54277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882400.54314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882400.54336: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.54353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.54398: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882400.54416: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.54426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.54551: Set connection var ansible_pipelining to False 12081 1726882400.54560: Set connection var ansible_shell_type to sh 12081 1726882400.54579: Set connection var ansible_shell_executable to /bin/sh 12081 1726882400.54587: Set connection var ansible_connection to ssh 12081 1726882400.54605: Set connection var ansible_timeout to 10 12081 1726882400.54616: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882400.54653: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.54666: variable 'ansible_connection' from source: unknown 12081 1726882400.54680: variable 'ansible_module_compression' from source: unknown 12081 1726882400.54690: variable 'ansible_shell_type' from source: unknown 12081 1726882400.54701: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.54743: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.54752: variable 'ansible_pipelining' from source: unknown 12081 1726882400.54760: variable 'ansible_timeout' from source: unknown 12081 1726882400.54775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.55159: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882400.55181: variable 'omit' from source: magic vars 12081 1726882400.55197: starting attempt loop 12081 1726882400.55204: running the handler 12081 1726882400.55222: handler run complete 12081 1726882400.55301: attempt loop complete, returning result 12081 1726882400.55308: _execute() done 12081 1726882400.55313: dumping result to json 12081 1726882400.55320: done dumping result, returning 12081 1726882400.55329: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-0a3f-ff3c-000000000558] 12081 1726882400.55338: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000558 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12081 1726882400.55523: no more pending results, returning what we have 12081 1726882400.55528: results queue empty 12081 1726882400.55529: checking for any_errors_fatal 12081 1726882400.55531: done checking for any_errors_fatal 12081 1726882400.55531: checking for max_fail_percentage 12081 1726882400.55533: done checking for max_fail_percentage 12081 1726882400.55534: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.55535: done checking to see if all hosts have failed 12081 1726882400.55536: getting the remaining hosts for this loop 12081 1726882400.55538: done getting the remaining hosts for this loop 12081 1726882400.55543: getting the next task for host managed_node3 12081 1726882400.55553: done getting next task for host managed_node3 12081 1726882400.55555: ^ task is: TASK: Stat profile file 12081 1726882400.55565: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882400.55570: getting variables 12081 1726882400.55573: in VariableManager get_vars() 12081 1726882400.55607: Calling all_inventory to load vars for managed_node3 12081 1726882400.55610: Calling groups_inventory to load vars for managed_node3 12081 1726882400.55614: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.55628: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.55631: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.55634: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.56610: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000558 12081 1726882400.56614: WORKER PROCESS EXITING 12081 1726882400.57683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.58998: done with get_vars() 12081 1726882400.59014: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:20 -0400 (0:00:00.067) 0:00:20.393 ****** 12081 1726882400.59093: entering _queue_task() for managed_node3/stat 12081 1726882400.59470: worker is 1 (out of 1 available) 12081 1726882400.59483: exiting _queue_task() for managed_node3/stat 12081 1726882400.59495: done queuing things up, now waiting for results queue to drain 12081 1726882400.59497: waiting for pending results... 
12081 1726882400.59913: running TaskExecutor() for managed_node3/TASK: Stat profile file 12081 1726882400.60074: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000559 12081 1726882400.60119: variable 'ansible_search_path' from source: unknown 12081 1726882400.60130: variable 'ansible_search_path' from source: unknown 12081 1726882400.60179: calling self._execute() 12081 1726882400.60336: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.60347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.60361: variable 'omit' from source: magic vars 12081 1726882400.60782: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.60799: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.60809: variable 'omit' from source: magic vars 12081 1726882400.60881: variable 'omit' from source: magic vars 12081 1726882400.60985: variable 'profile' from source: include params 12081 1726882400.60994: variable 'bond_port_profile' from source: include params 12081 1726882400.61069: variable 'bond_port_profile' from source: include params 12081 1726882400.61101: variable 'omit' from source: magic vars 12081 1726882400.61149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882400.61191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882400.61216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882400.61243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.61259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882400.61291: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882400.61307: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.61316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.61427: Set connection var ansible_pipelining to False 12081 1726882400.61435: Set connection var ansible_shell_type to sh 12081 1726882400.61449: Set connection var ansible_shell_executable to /bin/sh 12081 1726882400.61463: Set connection var ansible_connection to ssh 12081 1726882400.61477: Set connection var ansible_timeout to 10 12081 1726882400.61487: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882400.61519: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.61525: variable 'ansible_connection' from source: unknown 12081 1726882400.61536: variable 'ansible_module_compression' from source: unknown 12081 1726882400.61542: variable 'ansible_shell_type' from source: unknown 12081 1726882400.61548: variable 'ansible_shell_executable' from source: unknown 12081 1726882400.61556: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.61565: variable 'ansible_pipelining' from source: unknown 12081 1726882400.61572: variable 'ansible_timeout' from source: unknown 12081 1726882400.61579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.61791: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882400.61807: variable 'omit' from source: magic vars 12081 1726882400.61816: starting attempt loop 12081 1726882400.61822: running the handler 12081 1726882400.61846: _low_level_execute_command(): starting 12081 1726882400.61860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882400.62850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882400.62867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.62883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.62902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.62944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.62960: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882400.62978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.62995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882400.63006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882400.63016: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882400.63029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.63043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.63081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.63097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.63110: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882400.63125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.63200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882400.63222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.63241: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882400.63381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.65079: stdout chunk (state=3): >>>/root <<< 12081 1726882400.65276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882400.65280: stdout chunk (state=3): >>><<< 12081 1726882400.65282: stderr chunk (state=3): >>><<< 12081 1726882400.65406: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882400.65409: _low_level_execute_command(): starting 12081 1726882400.65412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121 `" && echo 
ansible-tmp-1726882400.6530359-12981-184396892948121="` echo /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121 `" ) && sleep 0' 12081 1726882400.66006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882400.66020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.66035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.66057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.66102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.66114: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882400.66128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.66146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882400.66158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882400.66177: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882400.66189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.66201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.66215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.66226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.66237: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882400.66249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.66324: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882400.66346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.66362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882400.66502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.68396: stdout chunk (state=3): >>>ansible-tmp-1726882400.6530359-12981-184396892948121=/root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121 <<< 12081 1726882400.68603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882400.68608: stdout chunk (state=3): >>><<< 12081 1726882400.68610: stderr chunk (state=3): >>><<< 12081 1726882400.68944: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882400.6530359-12981-184396892948121=/root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882400.68948: variable 'ansible_module_compression' from source: unknown 12081 1726882400.68950: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882400.68952: variable 'ansible_facts' from source: unknown 12081 1726882400.68954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/AnsiballZ_stat.py 12081 1726882400.69023: Sending initial data 12081 1726882400.69026: Sent initial data (153 bytes) 12081 1726882400.70021: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882400.70037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.70060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.70082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.70124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.70138: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882400.70152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.70181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882400.70194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882400.70207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882400.70219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.70234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.70252: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.70271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882400.70287: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882400.70301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.70378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882400.70406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.70421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882400.70551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.72287: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882400.72382: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882400.72483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpd4j4mucm /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/AnsiballZ_stat.py <<< 12081 1726882400.72581: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882400.73711: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882400.73819: stderr chunk (state=3): >>><<< 12081 1726882400.73823: stdout chunk (state=3): >>><<< 12081 1726882400.73844: done transferring module to remote 12081 1726882400.73856: _low_level_execute_command(): starting 12081 1726882400.73865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/ /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/AnsiballZ_stat.py && sleep 0' 12081 1726882400.74303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.74309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.74356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.74360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.74363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.74408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882400.74415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.74425: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882400.74535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.76274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882400.76327: stderr chunk (state=3): >>><<< 12081 1726882400.76330: stdout chunk (state=3): >>><<< 12081 1726882400.76345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882400.76348: _low_level_execute_command(): starting 12081 1726882400.76353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/AnsiballZ_stat.py && sleep 0' 12081 1726882400.76818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 
1726882400.76822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.76862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.76865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882400.76868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882400.76870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.76923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882400.76926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.76930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882400.77038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.90194: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882400.91172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882400.91232: stderr chunk (state=3): >>><<< 12081 1726882400.91236: stdout chunk (state=3): >>><<< 12081 1726882400.91252: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882400.91279: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882400.91287: _low_level_execute_command(): starting 12081 1726882400.91292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882400.6530359-12981-184396892948121/ > /dev/null 2>&1 && sleep 0' 12081 1726882400.91741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882400.91758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882400.91785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882400.91798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882400.91844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882400.91856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882400.91870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882400.91985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882400.93826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882400.93883: stderr chunk (state=3): >>><<< 12081 1726882400.93890: stdout chunk (state=3): >>><<< 12081 1726882400.93906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 12081 1726882400.93911: handler run complete 12081 1726882400.93927: attempt loop complete, returning result 12081 1726882400.93930: _execute() done 12081 1726882400.93933: dumping result to json 12081 1726882400.93935: done dumping result, returning 12081 1726882400.93942: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-0a3f-ff3c-000000000559] 12081 1726882400.93948: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000559 12081 1726882400.94046: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000559 12081 1726882400.94049: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12081 1726882400.94104: no more pending results, returning what we have 12081 1726882400.94108: results queue empty 12081 1726882400.94109: checking for any_errors_fatal 12081 1726882400.94119: done checking for any_errors_fatal 12081 1726882400.94120: checking for max_fail_percentage 12081 1726882400.94122: done checking for max_fail_percentage 12081 1726882400.94123: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.94124: done checking to see if all hosts have failed 12081 1726882400.94125: getting the remaining hosts for this loop 12081 1726882400.94126: done getting the remaining hosts for this loop 12081 1726882400.94130: getting the next task for host managed_node3 12081 1726882400.94136: done getting next task for host managed_node3 12081 1726882400.94139: ^ task is: TASK: Set NM profile exist flag based on the profile files 12081 1726882400.94145: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882400.94149: getting variables 12081 1726882400.94151: in VariableManager get_vars() 12081 1726882400.94183: Calling all_inventory to load vars for managed_node3 12081 1726882400.94186: Calling groups_inventory to load vars for managed_node3 12081 1726882400.94190: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.94202: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.94203: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.94206: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.95031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.95997: done with get_vars() 12081 1726882400.96026: done getting variables 12081 1726882400.96086: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:20 -0400 (0:00:00.370) 0:00:20.764 ****** 12081 1726882400.96119: entering _queue_task() for managed_node3/set_fact 12081 1726882400.96432: worker is 1 (out of 1 available) 12081 1726882400.96444: exiting _queue_task() for managed_node3/set_fact 12081 1726882400.96456: done queuing things up, now waiting for results queue to drain 12081 1726882400.96457: waiting for pending results... 
12081 1726882400.96744: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12081 1726882400.96887: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000055a 12081 1726882400.96912: variable 'ansible_search_path' from source: unknown 12081 1726882400.96920: variable 'ansible_search_path' from source: unknown 12081 1726882400.96960: calling self._execute() 12081 1726882400.97061: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882400.97074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882400.97088: variable 'omit' from source: magic vars 12081 1726882400.97430: variable 'ansible_distribution_major_version' from source: facts 12081 1726882400.97442: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882400.97542: variable 'profile_stat' from source: set_fact 12081 1726882400.97553: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882400.97557: when evaluation is False, skipping this task 12081 1726882400.97559: _execute() done 12081 1726882400.97562: dumping result to json 12081 1726882400.97566: done dumping result, returning 12081 1726882400.97569: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-0a3f-ff3c-00000000055a] 12081 1726882400.97577: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055a 12081 1726882400.97660: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055a 12081 1726882400.97663: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882400.97709: no more pending results, returning what we have 12081 1726882400.97714: results queue empty 12081 1726882400.97715: checking for any_errors_fatal 12081 1726882400.97724: done checking for any_errors_fatal 12081 1726882400.97725: 
checking for max_fail_percentage 12081 1726882400.97727: done checking for max_fail_percentage 12081 1726882400.97729: checking to see if all hosts have failed and the running result is not ok 12081 1726882400.97730: done checking to see if all hosts have failed 12081 1726882400.97730: getting the remaining hosts for this loop 12081 1726882400.97732: done getting the remaining hosts for this loop 12081 1726882400.97735: getting the next task for host managed_node3 12081 1726882400.97742: done getting next task for host managed_node3 12081 1726882400.97745: ^ task is: TASK: Get NM profile info 12081 1726882400.97753: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12081 1726882400.97758: getting variables 12081 1726882400.97759: in VariableManager get_vars() 12081 1726882400.97786: Calling all_inventory to load vars for managed_node3 12081 1726882400.97789: Calling groups_inventory to load vars for managed_node3 12081 1726882400.97791: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882400.97802: Calling all_plugins_play to load vars for managed_node3 12081 1726882400.97804: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882400.97807: Calling groups_plugins_play to load vars for managed_node3 12081 1726882400.98737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882400.99830: done with get_vars() 12081 1726882400.99846: done getting variables 12081 1726882400.99893: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:20 -0400 (0:00:00.038) 0:00:20.802 ****** 12081 1726882400.99919: entering _queue_task() for managed_node3/shell 12081 1726882401.00215: worker is 1 (out of 1 available) 12081 1726882401.00227: exiting _queue_task() for managed_node3/shell 12081 1726882401.00239: done queuing things up, now waiting for results queue to drain 12081 1726882401.00240: waiting for pending results... 
12081 1726882401.00507: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12081 1726882401.00631: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000055b 12081 1726882401.00648: variable 'ansible_search_path' from source: unknown 12081 1726882401.00656: variable 'ansible_search_path' from source: unknown 12081 1726882401.00698: calling self._execute() 12081 1726882401.00793: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.00797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.00806: variable 'omit' from source: magic vars 12081 1726882401.01080: variable 'ansible_distribution_major_version' from source: facts 12081 1726882401.01091: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882401.01099: variable 'omit' from source: magic vars 12081 1726882401.01139: variable 'omit' from source: magic vars 12081 1726882401.01211: variable 'profile' from source: include params 12081 1726882401.01216: variable 'bond_port_profile' from source: include params 12081 1726882401.01265: variable 'bond_port_profile' from source: include params 12081 1726882401.01279: variable 'omit' from source: magic vars 12081 1726882401.01317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882401.01342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882401.01359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882401.01376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.01385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.01408: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882401.01411: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.01413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.01486: Set connection var ansible_pipelining to False 12081 1726882401.01490: Set connection var ansible_shell_type to sh 12081 1726882401.01495: Set connection var ansible_shell_executable to /bin/sh 12081 1726882401.01498: Set connection var ansible_connection to ssh 12081 1726882401.01503: Set connection var ansible_timeout to 10 12081 1726882401.01508: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882401.01526: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.01530: variable 'ansible_connection' from source: unknown 12081 1726882401.01533: variable 'ansible_module_compression' from source: unknown 12081 1726882401.01535: variable 'ansible_shell_type' from source: unknown 12081 1726882401.01537: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.01541: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.01543: variable 'ansible_pipelining' from source: unknown 12081 1726882401.01545: variable 'ansible_timeout' from source: unknown 12081 1726882401.01547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.01644: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882401.01652: variable 'omit' from source: magic vars 12081 1726882401.01663: starting attempt loop 12081 1726882401.01668: running the handler 12081 1726882401.01676: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882401.01692: _low_level_execute_command(): starting 12081 1726882401.01698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882401.02219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.02247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.02262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882401.02276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.02315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882401.02330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882401.02340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.02449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.04059: stdout chunk (state=3): >>>/root <<< 12081 1726882401.04160: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 12081 1726882401.04215: stderr chunk (state=3): >>><<< 12081 1726882401.04219: stdout chunk (state=3): >>><<< 12081 1726882401.04238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882401.04248: _low_level_execute_command(): starting 12081 1726882401.04256: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483 `" && echo ansible-tmp-1726882401.042367-12998-193021813391483="` echo /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483 `" ) && sleep 0' 12081 1726882401.04701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.04713: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.04740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.04752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.04810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882401.04828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.04932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.06794: stdout chunk (state=3): >>>ansible-tmp-1726882401.042367-12998-193021813391483=/root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483 <<< 12081 1726882401.07075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882401.07079: stdout chunk (state=3): >>><<< 12081 1726882401.07081: stderr chunk (state=3): >>><<< 12081 1726882401.07084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882401.042367-12998-193021813391483=/root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882401.07086: variable 'ansible_module_compression' from source: unknown 12081 1726882401.07088: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882401.07184: variable 'ansible_facts' from source: unknown 12081 1726882401.07194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/AnsiballZ_command.py 12081 1726882401.07331: Sending initial data 12081 1726882401.07335: Sent initial data (155 bytes) 12081 1726882401.08222: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882401.08230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.08239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882401.08254: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.08293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882401.08299: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882401.08308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.08320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882401.08328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882401.08334: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882401.08341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.08350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882401.08361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.08370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882401.08377: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882401.08385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.08456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882401.08472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882401.08483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.08605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.10338: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882401.10435: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882401.10535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpfntoynq9 /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/AnsiballZ_command.py <<< 12081 1726882401.10629: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882401.11656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882401.11769: stderr chunk (state=3): >>><<< 12081 1726882401.11772: stdout chunk (state=3): >>><<< 12081 1726882401.11789: done transferring module to remote 12081 1726882401.11799: _low_level_execute_command(): starting 12081 1726882401.11803: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/ /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/AnsiballZ_command.py && sleep 0' 12081 1726882401.12267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.12271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882401.12285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.12334: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882401.12341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.12344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882401.12346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.12389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882401.12403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.12515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.14251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882401.14307: stderr chunk (state=3): >>><<< 12081 1726882401.14311: stdout chunk (state=3): >>><<< 12081 1726882401.14325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882401.14328: _low_level_execute_command(): starting 12081 1726882401.14333: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/AnsiballZ_command.py && sleep 0' 12081 1726882401.14790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882401.14803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.14828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.14839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882401.14890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882401.14902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.15023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.30766: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:21.283032", "end": "2024-09-20 21:33:21.306243", "delta": "0:00:00.023211", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882401.32144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882401.32177: stderr chunk (state=3): >>><<< 12081 1726882401.32180: stdout chunk (state=3): >>><<< 12081 1726882401.32324: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:21.283032", "end": "2024-09-20 21:33:21.306243", "delta": "0:00:00.023211", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882401.32328: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882401.32330: _low_level_execute_command(): starting 12081 1726882401.32333: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882401.042367-12998-193021813391483/ > /dev/null 2>&1 && sleep 0' 12081 1726882401.32935: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882401.32954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882401.32973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882401.32992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882401.33036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882401.33049: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882401.33132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.33147: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882401.33265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882401.33419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882401.35275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882401.35377: stderr chunk (state=3): >>><<< 12081 1726882401.35389: stdout chunk (state=3): >>><<< 12081 1726882401.35571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882401.35575: handler run complete 12081 1726882401.35577: Evaluated conditional (False): False 12081 1726882401.35579: attempt loop complete, returning result 12081 1726882401.35581: _execute() done 12081 1726882401.35584: dumping result to json 12081 1726882401.35586: done dumping result, returning 12081 1726882401.35588: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-0a3f-ff3c-00000000055b] 12081 1726882401.35590: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055b 12081 1726882401.35669: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055b 12081 1726882401.35672: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc",
    "delta": "0:00:00.023211",
    "end": "2024-09-20 21:33:21.306243",
    "rc": 0,
    "start": "2024-09-20 21:33:21.283032"
}

STDOUT:

bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection

12081 1726882401.35754: no more pending results, returning what we have 12081 1726882401.35759: results queue empty 12081 1726882401.35759: checking for any_errors_fatal 12081 1726882401.35772: done checking for any_errors_fatal 12081 1726882401.35773: checking for max_fail_percentage 12081 1726882401.35776: done checking for max_fail_percentage 12081 1726882401.35777: checking to see if all hosts have failed and the running result is not ok 12081 1726882401.35778: done checking to see if all hosts have failed 12081 1726882401.35779: getting the remaining hosts for this loop 12081 1726882401.35781: done getting the remaining hosts for this loop 12081 1726882401.35785: getting the next task for host managed_node3 12081 1726882401.35794: done getting next task for host managed_node3 12081 1726882401.35797: ^ task is: 
TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882401.35804: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882401.35809: getting variables 12081 1726882401.35811: in VariableManager get_vars() 12081 1726882401.35844: Calling all_inventory to load vars for managed_node3 12081 1726882401.35847: Calling groups_inventory to load vars for managed_node3 12081 1726882401.35854: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882401.35869: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.35872: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.35875: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.37906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882401.39805: done with get_vars() 12081 1726882401.39830: done getting variables 12081 1726882401.39900: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 21:33:21 -0400 (0:00:00.400) 0:00:21.202 ******

12081 1726882401.39937: entering _queue_task() for managed_node3/set_fact 12081 1726882401.40285: worker is 1 (out of 1 available) 12081 1726882401.40300: exiting _queue_task() for managed_node3/set_fact 12081 1726882401.40314: done queuing things up, now waiting for results queue to drain 12081 1726882401.40315: waiting for pending results... 
12081 1726882401.40623: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882401.40778: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000055c 12081 1726882401.40798: variable 'ansible_search_path' from source: unknown 12081 1726882401.40805: variable 'ansible_search_path' from source: unknown 12081 1726882401.40847: calling self._execute() 12081 1726882401.40958: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.40973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.40992: variable 'omit' from source: magic vars 12081 1726882401.41373: variable 'ansible_distribution_major_version' from source: facts 12081 1726882401.41394: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882401.41540: variable 'nm_profile_exists' from source: set_fact 12081 1726882401.41557: Evaluated conditional (nm_profile_exists.rc == 0): True 12081 1726882401.41568: variable 'omit' from source: magic vars 12081 1726882401.41624: variable 'omit' from source: magic vars 12081 1726882401.41665: variable 'omit' from source: magic vars 12081 1726882401.41709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882401.41747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882401.41778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882401.41803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.41820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.41859: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12081 1726882401.41870: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.41878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.41985: Set connection var ansible_pipelining to False 12081 1726882401.41992: Set connection var ansible_shell_type to sh 12081 1726882401.42004: Set connection var ansible_shell_executable to /bin/sh 12081 1726882401.42010: Set connection var ansible_connection to ssh 12081 1726882401.42019: Set connection var ansible_timeout to 10 12081 1726882401.42030: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882401.42065: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.42079: variable 'ansible_connection' from source: unknown 12081 1726882401.42086: variable 'ansible_module_compression' from source: unknown 12081 1726882401.42091: variable 'ansible_shell_type' from source: unknown 12081 1726882401.42097: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.42102: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.42109: variable 'ansible_pipelining' from source: unknown 12081 1726882401.42115: variable 'ansible_timeout' from source: unknown 12081 1726882401.42122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.42268: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882401.42287: variable 'omit' from source: magic vars 12081 1726882401.42298: starting attempt loop 12081 1726882401.42305: running the handler 12081 1726882401.42323: handler run complete 12081 1726882401.42337: attempt loop complete, returning result 12081 1726882401.42343: _execute() done 
12081 1726882401.42349: dumping result to json
12081 1726882401.42359: done dumping result, returning
12081 1726882401.42372: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-0a3f-ff3c-00000000055c]
12081 1726882401.42382: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055c
12081 1726882401.42493: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055c
12081 1726882401.42500: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
12081 1726882401.42562: no more pending results, returning what we have
12081 1726882401.42568: results queue empty
12081 1726882401.42569: checking for any_errors_fatal
12081 1726882401.42580: done checking for any_errors_fatal
12081 1726882401.42581: checking for max_fail_percentage
12081 1726882401.42583: done checking for max_fail_percentage
12081 1726882401.42584: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.42585: done checking to see if all hosts have failed
12081 1726882401.42586: getting the remaining hosts for this loop
12081 1726882401.42588: done getting the remaining hosts for this loop
12081 1726882401.42592: getting the next task for host managed_node3
12081 1726882401.42603: done getting next task for host managed_node3
12081 1726882401.42605: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
12081 1726882401.42612: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882401.42616: getting variables
12081 1726882401.42618: in VariableManager get_vars()
12081 1726882401.42649: Calling all_inventory to load vars for managed_node3
12081 1726882401.42654: Calling groups_inventory to load vars for managed_node3
12081 1726882401.42658: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882401.42672: Calling all_plugins_play to load vars for managed_node3
12081 1726882401.42675: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882401.42680: Calling groups_plugins_play to load vars for managed_node3
12081 1726882401.44374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882401.46106: done with get_vars()
12081 1726882401.46143: done getting variables
12081 1726882401.46212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882401.46342: variable 'profile' from source: include params
12081 1726882401.46346: variable 'bond_port_profile' from source: include params
12081 1726882401.46410: variable 'bond_port_profile' from source: include params

TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 21:33:21 -0400 (0:00:00.065) 0:00:21.267 ******
12081 1726882401.46446: entering _queue_task() for managed_node3/command
12081 1726882401.46807: worker is 1 (out of 1 available)
12081 1726882401.46821: exiting _queue_task() for managed_node3/command
12081 1726882401.46833: done queuing things up, now waiting for results queue to drain
12081 1726882401.46834: waiting for pending results...
12081 1726882401.47129: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0
12081 1726882401.47273: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000055e
12081 1726882401.47299: variable 'ansible_search_path' from source: unknown
12081 1726882401.47308: variable 'ansible_search_path' from source: unknown
12081 1726882401.47353: calling self._execute()
12081 1726882401.47459: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.47473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.47487: variable 'omit' from source: magic vars
12081 1726882401.47825: variable 'ansible_distribution_major_version' from source: facts
12081 1726882401.47839: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882401.47939: variable 'profile_stat' from source: set_fact
12081 1726882401.47948: Evaluated conditional (profile_stat.stat.exists): False
12081 1726882401.47954: when evaluation is False, skipping this task
12081 1726882401.47957: _execute() done
12081 1726882401.47959: dumping result to json
12081 1726882401.47961: done dumping result, returning
12081 1726882401.47966: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-0a3f-ff3c-00000000055e]
12081 1726882401.47974: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055e
12081 1726882401.48059: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055e
12081 1726882401.48061: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12081 1726882401.48113: no more pending results, returning what we have
12081 1726882401.48116: results queue empty
12081 1726882401.48117: checking for any_errors_fatal
12081 1726882401.48128: done checking for any_errors_fatal
12081 1726882401.48128: checking for max_fail_percentage
12081 1726882401.48130: done checking for max_fail_percentage
12081 1726882401.48131: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.48132: done checking to see if all hosts have failed
12081 1726882401.48133: getting the remaining hosts for this loop
12081 1726882401.48135: done getting the remaining hosts for this loop
12081 1726882401.48138: getting the next task for host managed_node3
12081 1726882401.48146: done getting next task for host managed_node3
12081 1726882401.48149: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
12081 1726882401.48158: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882401.48161: getting variables
12081 1726882401.48164: in VariableManager get_vars()
12081 1726882401.48195: Calling all_inventory to load vars for managed_node3
12081 1726882401.48197: Calling groups_inventory to load vars for managed_node3
12081 1726882401.48200: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882401.48211: Calling all_plugins_play to load vars for managed_node3
12081 1726882401.48214: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882401.48216: Calling groups_plugins_play to load vars for managed_node3
12081 1726882401.49173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882401.50633: done with get_vars()
12081 1726882401.50668: done getting variables
12081 1726882401.50736: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882401.50841: variable 'profile' from source: include params
12081 1726882401.50845: variable 'bond_port_profile' from source: include params
12081 1726882401.50894: variable 'bond_port_profile' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-bond0.0] *********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 21:33:21 -0400 (0:00:00.044) 0:00:21.312 ******
12081 1726882401.50918: entering _queue_task() for managed_node3/set_fact
12081 1726882401.51164: worker is 1 (out of 1 available)
12081 1726882401.51177: exiting _queue_task() for managed_node3/set_fact
12081 1726882401.51189: done queuing things up, now waiting for results queue to drain
12081 1726882401.51191: waiting for pending results...
12081 1726882401.51367: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0
12081 1726882401.51454: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000055f
12081 1726882401.51466: variable 'ansible_search_path' from source: unknown
12081 1726882401.51470: variable 'ansible_search_path' from source: unknown
12081 1726882401.51500: calling self._execute()
12081 1726882401.51576: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.51580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.51589: variable 'omit' from source: magic vars
12081 1726882401.51842: variable 'ansible_distribution_major_version' from source: facts
12081 1726882401.51856: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882401.51939: variable 'profile_stat' from source: set_fact
12081 1726882401.51950: Evaluated conditional (profile_stat.stat.exists): False
12081 1726882401.51958: when evaluation is False, skipping this task
12081 1726882401.51960: _execute() done
12081 1726882401.51963: dumping result to json
12081 1726882401.51967: done dumping result, returning
12081 1726882401.51970: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-0a3f-ff3c-00000000055f]
12081 1726882401.51972: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055f
12081 1726882401.52058: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000055f
12081 1726882401.52061: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12081 1726882401.52126: no more pending results, returning what we have
12081 1726882401.52130: results queue empty
12081 1726882401.52131: checking for any_errors_fatal
12081 1726882401.52136: done checking for any_errors_fatal
12081 1726882401.52137: checking for max_fail_percentage
12081 1726882401.52138: done checking for max_fail_percentage
12081 1726882401.52139: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.52140: done checking to see if all hosts have failed
12081 1726882401.52141: getting the remaining hosts for this loop
12081 1726882401.52143: done getting the remaining hosts for this loop
12081 1726882401.52146: getting the next task for host managed_node3
12081 1726882401.52157: done getting next task for host managed_node3
12081 1726882401.52159: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
12081 1726882401.52167: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882401.52171: getting variables
12081 1726882401.52172: in VariableManager get_vars()
12081 1726882401.52201: Calling all_inventory to load vars for managed_node3
12081 1726882401.52204: Calling groups_inventory to load vars for managed_node3
12081 1726882401.52207: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882401.52217: Calling all_plugins_play to load vars for managed_node3
12081 1726882401.52219: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882401.52222: Calling groups_plugins_play to load vars for managed_node3
12081 1726882401.53228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882401.54876: done with get_vars()
12081 1726882401.54899: done getting variables
12081 1726882401.54945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882401.55042: variable 'profile' from source: include params
12081 1726882401.55045: variable 'bond_port_profile' from source: include params
12081 1726882401.55089: variable 'bond_port_profile' from source: include params

TASK [Get the fingerprint comment in ifcfg-bond0.0] ****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:33:21 -0400 (0:00:00.041) 0:00:21.354 ******
12081 1726882401.55114: entering _queue_task() for managed_node3/command
12081 1726882401.55348: worker is 1 (out of 1 available)
12081 1726882401.55365: exiting _queue_task() for managed_node3/command
12081 1726882401.55377: done queuing things up, now waiting for results queue to drain
12081 1726882401.55379: waiting for pending results...
12081 1726882401.55553: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0
12081 1726882401.55643: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000560
12081 1726882401.55657: variable 'ansible_search_path' from source: unknown
12081 1726882401.55661: variable 'ansible_search_path' from source: unknown
12081 1726882401.55693: calling self._execute()
12081 1726882401.55767: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.55771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.55779: variable 'omit' from source: magic vars
12081 1726882401.56034: variable 'ansible_distribution_major_version' from source: facts
12081 1726882401.56046: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882401.56130: variable 'profile_stat' from source: set_fact
12081 1726882401.56139: Evaluated conditional (profile_stat.stat.exists): False
12081 1726882401.56142: when evaluation is False, skipping this task
12081 1726882401.56147: _execute() done
12081 1726882401.56150: dumping result to json
12081 1726882401.56156: done dumping result, returning
12081 1726882401.56159: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-0a3f-ff3c-000000000560]
12081 1726882401.56167: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000560
12081 1726882401.56248: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000560
12081 1726882401.56253: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12081 1726882401.56314: no more pending results, returning what we have
12081 1726882401.56318: results queue empty
12081 1726882401.56319: checking for any_errors_fatal
12081 1726882401.56327: done checking for any_errors_fatal
12081 1726882401.56328: checking for max_fail_percentage
12081 1726882401.56330: done checking for max_fail_percentage
12081 1726882401.56331: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.56332: done checking to see if all hosts have failed
12081 1726882401.56333: getting the remaining hosts for this loop
12081 1726882401.56335: done getting the remaining hosts for this loop
12081 1726882401.56338: getting the next task for host managed_node3
12081 1726882401.56346: done getting next task for host managed_node3
12081 1726882401.56349: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
12081 1726882401.56357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882401.56361: getting variables
12081 1726882401.56366: in VariableManager get_vars()
12081 1726882401.56396: Calling all_inventory to load vars for managed_node3
12081 1726882401.56398: Calling groups_inventory to load vars for managed_node3
12081 1726882401.56401: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882401.56412: Calling all_plugins_play to load vars for managed_node3
12081 1726882401.56415: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882401.56417: Calling groups_plugins_play to load vars for managed_node3
12081 1726882401.57802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882401.60057: done with get_vars()
12081 1726882401.60101: done getting variables
12081 1726882401.60174: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882401.60339: variable 'profile' from source: include params
12081 1726882401.60343: variable 'bond_port_profile' from source: include params
12081 1726882401.60424: variable 'bond_port_profile' from source: include params

TASK [Verify the fingerprint comment in ifcfg-bond0.0] *************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:33:21 -0400 (0:00:00.053) 0:00:21.407 ******
12081 1726882401.60467: entering _queue_task() for managed_node3/set_fact
12081 1726882401.60827: worker is 1 (out of 1 available)
12081 1726882401.60838: exiting _queue_task() for managed_node3/set_fact
12081 1726882401.60853: done queuing things up, now waiting for results queue to drain
12081 1726882401.60855: waiting for pending results...
12081 1726882401.61182: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0
12081 1726882401.61318: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000561
12081 1726882401.61332: variable 'ansible_search_path' from source: unknown
12081 1726882401.61336: variable 'ansible_search_path' from source: unknown
12081 1726882401.61383: calling self._execute()
12081 1726882401.61489: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.61493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.61503: variable 'omit' from source: magic vars
12081 1726882401.61895: variable 'ansible_distribution_major_version' from source: facts
12081 1726882401.61909: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882401.62245: variable 'profile_stat' from source: set_fact
12081 1726882401.62258: Evaluated conditional (profile_stat.stat.exists): False
12081 1726882401.62262: when evaluation is False, skipping this task
12081 1726882401.62267: _execute() done
12081 1726882401.62270: dumping result to json
12081 1726882401.62272: done dumping result, returning
12081 1726882401.62288: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-0a3f-ff3c-000000000561]
12081 1726882401.62295: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000561
12081 1726882401.62394: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000561
12081 1726882401.62397: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12081 1726882401.62449: no more pending results, returning what we have
12081 1726882401.62457: results queue empty
12081 1726882401.62458: checking for any_errors_fatal
12081 1726882401.62468: done checking for any_errors_fatal
12081 1726882401.62469: checking for max_fail_percentage
12081 1726882401.62471: done checking for max_fail_percentage
12081 1726882401.62472: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.62474: done checking to see if all hosts have failed
12081 1726882401.62474: getting the remaining hosts for this loop
12081 1726882401.62476: done getting the remaining hosts for this loop
12081 1726882401.62480: getting the next task for host managed_node3
12081 1726882401.62492: done getting next task for host managed_node3
12081 1726882401.62496: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
12081 1726882401.62502: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882401.62508: getting variables
12081 1726882401.62510: in VariableManager get_vars()
12081 1726882401.62543: Calling all_inventory to load vars for managed_node3
12081 1726882401.62546: Calling groups_inventory to load vars for managed_node3
12081 1726882401.62553: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882401.62573: Calling all_plugins_play to load vars for managed_node3
12081 1726882401.62577: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882401.62580: Calling groups_plugins_play to load vars for managed_node3
12081 1726882401.64494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882401.66482: done with get_vars()
12081 1726882401.66509: done getting variables
12081 1726882401.66576: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882401.66705: variable 'profile' from source: include params
12081 1726882401.66714: variable 'bond_port_profile' from source: include params
12081 1726882401.66780: variable 'bond_port_profile' from source: include params

TASK [Assert that the profile is present - 'bond0.0'] **************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 21:33:21 -0400 (0:00:00.063) 0:00:21.471 ******
12081 1726882401.66814: entering _queue_task() for managed_node3/assert
12081 1726882401.67171: worker is 1 (out of 1 available)
12081 1726882401.67183: exiting _queue_task() for managed_node3/assert
12081 1726882401.67195: done queuing things up, now waiting for results queue to drain
12081 1726882401.67196: waiting for pending results...
12081 1726882401.67500: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0'
12081 1726882401.67617: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e1
12081 1726882401.67630: variable 'ansible_search_path' from source: unknown
12081 1726882401.67636: variable 'ansible_search_path' from source: unknown
12081 1726882401.67682: calling self._execute()
12081 1726882401.67786: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.67790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.67801: variable 'omit' from source: magic vars
12081 1726882401.68193: variable 'ansible_distribution_major_version' from source: facts
12081 1726882401.68207: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882401.68214: variable 'omit' from source: magic vars
12081 1726882401.68279: variable 'omit' from source: magic vars
12081 1726882401.68388: variable 'profile' from source: include params
12081 1726882401.68392: variable 'bond_port_profile' from source: include params
12081 1726882401.68468: variable 'bond_port_profile' from source: include params
12081 1726882401.68488: variable 'omit' from source: magic vars
12081 1726882401.68534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12081 1726882401.68573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12081 1726882401.68596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12081 1726882401.68613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882401.68629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12081 1726882401.68661: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12081 1726882401.68667: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.68669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.68797: Set connection var ansible_pipelining to False
12081 1726882401.68800: Set connection var ansible_shell_type to sh
12081 1726882401.68809: Set connection var ansible_shell_executable to /bin/sh
12081 1726882401.68812: Set connection var ansible_connection to ssh
12081 1726882401.68817: Set connection var ansible_timeout to 10
12081 1726882401.68822: Set connection var ansible_module_compression to ZIP_DEFLATED
12081 1726882401.68854: variable 'ansible_shell_executable' from source: unknown
12081 1726882401.68859: variable 'ansible_connection' from source: unknown
12081 1726882401.68862: variable 'ansible_module_compression' from source: unknown
12081 1726882401.68867: variable 'ansible_shell_type' from source: unknown
12081 1726882401.68869: variable 'ansible_shell_executable' from source: unknown
12081 1726882401.68873: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882401.68878: variable 'ansible_pipelining' from source: unknown
12081 1726882401.68881: variable 'ansible_timeout' from source: unknown
12081 1726882401.68883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882401.69029: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
12081 1726882401.69040: variable 'omit' from source: magic vars
12081 1726882401.69045: starting attempt loop
12081 1726882401.69048: running the handler
12081 1726882401.69175: variable 'lsr_net_profile_exists' from source: set_fact
12081 1726882401.69182: Evaluated conditional (lsr_net_profile_exists): True
12081 1726882401.69187: handler run complete
12081 1726882401.69203: attempt loop complete, returning result
12081 1726882401.69206: _execute() done
12081 1726882401.69208: dumping result to json
12081 1726882401.69211: done dumping result, returning
12081 1726882401.69218: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0e448fcc-3ce9-0a3f-ff3c-0000000004e1]
12081 1726882401.69230: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e1
12081 1726882401.69321: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e1
12081 1726882401.69324: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
12081 1726882401.69382: no more pending results, returning what we have
12081 1726882401.69387: results queue empty
12081 1726882401.69387: checking for any_errors_fatal
12081 1726882401.69397: done checking for any_errors_fatal
12081 1726882401.69397: checking for max_fail_percentage
12081 1726882401.69399: done checking for max_fail_percentage
12081 1726882401.69400: checking to see if all hosts have failed and the running result is not ok
12081 1726882401.69402: done checking to see if all hosts have failed
12081 1726882401.69402: getting the remaining hosts for this loop
12081 1726882401.69404: done getting the remaining hosts for this loop
12081 1726882401.69409: getting the next task for host managed_node3
12081 1726882401.69418: done getting next task for host managed_node3
12081 1726882401.69421: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
12081 1726882401.69425: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882401.69430: getting variables 12081 1726882401.69432: in VariableManager get_vars() 12081 1726882401.69471: Calling all_inventory to load vars for managed_node3 12081 1726882401.69475: Calling groups_inventory to load vars for managed_node3 12081 1726882401.69479: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882401.69492: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.69496: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.69499: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.71274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882401.73080: done with get_vars() 12081 1726882401.73108: done getting variables 12081 1726882401.73179: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882401.73312: variable 'profile' from source: include params 12081 1726882401.73317: variable 'bond_port_profile' from source: include params 12081 1726882401.73386: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:21 -0400 (0:00:00.066) 0:00:21.537 ****** 12081 1726882401.73420: entering _queue_task() for managed_node3/assert 12081 1726882401.73770: worker is 1 (out of 1 available) 12081 1726882401.73783: exiting _queue_task() for managed_node3/assert 12081 1726882401.73796: done queuing things up, now waiting for results queue to drain 12081 1726882401.73798: waiting for 
pending results... 12081 1726882401.74111: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 12081 1726882401.74229: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e2 12081 1726882401.74247: variable 'ansible_search_path' from source: unknown 12081 1726882401.74254: variable 'ansible_search_path' from source: unknown 12081 1726882401.74296: calling self._execute() 12081 1726882401.74394: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.74398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.74409: variable 'omit' from source: magic vars 12081 1726882401.74786: variable 'ansible_distribution_major_version' from source: facts 12081 1726882401.74809: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882401.74815: variable 'omit' from source: magic vars 12081 1726882401.74876: variable 'omit' from source: magic vars 12081 1726882401.74983: variable 'profile' from source: include params 12081 1726882401.74987: variable 'bond_port_profile' from source: include params 12081 1726882401.75062: variable 'bond_port_profile' from source: include params 12081 1726882401.75083: variable 'omit' from source: magic vars 12081 1726882401.75127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882401.75167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882401.75187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882401.75204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.75216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 
1726882401.75252: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882401.75258: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.75261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.75378: Set connection var ansible_pipelining to False 12081 1726882401.75381: Set connection var ansible_shell_type to sh 12081 1726882401.75390: Set connection var ansible_shell_executable to /bin/sh 12081 1726882401.75393: Set connection var ansible_connection to ssh 12081 1726882401.75398: Set connection var ansible_timeout to 10 12081 1726882401.75403: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882401.75427: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.75430: variable 'ansible_connection' from source: unknown 12081 1726882401.75433: variable 'ansible_module_compression' from source: unknown 12081 1726882401.75435: variable 'ansible_shell_type' from source: unknown 12081 1726882401.75437: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.75440: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.75451: variable 'ansible_pipelining' from source: unknown 12081 1726882401.75462: variable 'ansible_timeout' from source: unknown 12081 1726882401.75466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.75617: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882401.75628: variable 'omit' from source: magic vars 12081 1726882401.75633: starting attempt loop 12081 1726882401.75636: running the handler 12081 1726882401.75754: variable 
'lsr_net_profile_ansible_managed' from source: set_fact 12081 1726882401.75762: Evaluated conditional (lsr_net_profile_ansible_managed): True 12081 1726882401.75774: handler run complete 12081 1726882401.75796: attempt loop complete, returning result 12081 1726882401.75799: _execute() done 12081 1726882401.75802: dumping result to json 12081 1726882401.75804: done dumping result, returning 12081 1726882401.75811: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0e448fcc-3ce9-0a3f-ff3c-0000000004e2] 12081 1726882401.75819: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e2 12081 1726882401.75907: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e2 12081 1726882401.75910: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882401.75970: no more pending results, returning what we have 12081 1726882401.75974: results queue empty 12081 1726882401.75975: checking for any_errors_fatal 12081 1726882401.75983: done checking for any_errors_fatal 12081 1726882401.75984: checking for max_fail_percentage 12081 1726882401.75986: done checking for max_fail_percentage 12081 1726882401.75987: checking to see if all hosts have failed and the running result is not ok 12081 1726882401.75988: done checking to see if all hosts have failed 12081 1726882401.75989: getting the remaining hosts for this loop 12081 1726882401.75991: done getting the remaining hosts for this loop 12081 1726882401.75995: getting the next task for host managed_node3 12081 1726882401.76003: done getting next task for host managed_node3 12081 1726882401.76006: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12081 1726882401.76011: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882401.76015: getting variables 12081 1726882401.76017: in VariableManager get_vars() 12081 1726882401.76050: Calling all_inventory to load vars for managed_node3 12081 1726882401.76056: Calling groups_inventory to load vars for managed_node3 12081 1726882401.76059: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882401.76074: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.76077: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.76080: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.78061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882401.81177: done with get_vars() 12081 1726882401.81208: done getting variables 12081 1726882401.81279: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882401.81406: variable 'profile' from source: include params 12081 1726882401.81410: variable 'bond_port_profile' from source: include params 12081 1726882401.81479: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:21 -0400 (0:00:00.080) 0:00:21.618 ****** 12081 1726882401.81512: entering _queue_task() for managed_node3/assert 12081 1726882401.81848: worker is 1 (out of 1 available) 12081 1726882401.81863: exiting _queue_task() for managed_node3/assert 12081 1726882401.81878: done queuing things up, now waiting for results queue to drain 12081 1726882401.81880: waiting for pending results... 
12081 1726882401.82180: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 12081 1726882401.82292: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e3 12081 1726882401.82307: variable 'ansible_search_path' from source: unknown 12081 1726882401.82310: variable 'ansible_search_path' from source: unknown 12081 1726882401.82353: calling self._execute() 12081 1726882401.82449: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.82458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.82468: variable 'omit' from source: magic vars 12081 1726882401.84178: variable 'ansible_distribution_major_version' from source: facts 12081 1726882401.84191: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882401.84198: variable 'omit' from source: magic vars 12081 1726882401.84377: variable 'omit' from source: magic vars 12081 1726882401.84602: variable 'profile' from source: include params 12081 1726882401.84606: variable 'bond_port_profile' from source: include params 12081 1726882401.84787: variable 'bond_port_profile' from source: include params 12081 1726882401.84808: variable 'omit' from source: magic vars 12081 1726882401.84851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882401.85006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882401.85027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882401.85045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.85059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882401.85205: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882401.85208: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.85210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.85431: Set connection var ansible_pipelining to False 12081 1726882401.85434: Set connection var ansible_shell_type to sh 12081 1726882401.85443: Set connection var ansible_shell_executable to /bin/sh 12081 1726882401.85446: Set connection var ansible_connection to ssh 12081 1726882401.85451: Set connection var ansible_timeout to 10 12081 1726882401.85460: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882401.85491: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.85494: variable 'ansible_connection' from source: unknown 12081 1726882401.85497: variable 'ansible_module_compression' from source: unknown 12081 1726882401.85499: variable 'ansible_shell_type' from source: unknown 12081 1726882401.85501: variable 'ansible_shell_executable' from source: unknown 12081 1726882401.85503: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.85505: variable 'ansible_pipelining' from source: unknown 12081 1726882401.85508: variable 'ansible_timeout' from source: unknown 12081 1726882401.85513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.85891: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882401.85903: variable 'omit' from source: magic vars 12081 1726882401.85909: starting attempt loop 12081 1726882401.85911: running the handler 12081 1726882401.86150: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12081 1726882401.86158: Evaluated conditional (lsr_net_profile_fingerprint): True 12081 1726882401.86168: handler run complete 12081 1726882401.86329: attempt loop complete, returning result 12081 1726882401.86332: _execute() done 12081 1726882401.86335: dumping result to json 12081 1726882401.86337: done dumping result, returning 12081 1726882401.86339: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0e448fcc-3ce9-0a3f-ff3c-0000000004e3] 12081 1726882401.86341: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e3 12081 1726882401.86409: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e3 12081 1726882401.86413: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882401.86472: no more pending results, returning what we have 12081 1726882401.86477: results queue empty 12081 1726882401.86478: checking for any_errors_fatal 12081 1726882401.86487: done checking for any_errors_fatal 12081 1726882401.86488: checking for max_fail_percentage 12081 1726882401.86490: done checking for max_fail_percentage 12081 1726882401.86491: checking to see if all hosts have failed and the running result is not ok 12081 1726882401.86492: done checking to see if all hosts have failed 12081 1726882401.86493: getting the remaining hosts for this loop 12081 1726882401.86495: done getting the remaining hosts for this loop 12081 1726882401.86499: getting the next task for host managed_node3 12081 1726882401.86511: done getting next task for host managed_node3 12081 1726882401.86514: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12081 1726882401.86520: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882401.86524: getting variables 12081 1726882401.86526: in VariableManager get_vars() 12081 1726882401.86566: Calling all_inventory to load vars for managed_node3 12081 1726882401.86569: Calling groups_inventory to load vars for managed_node3 12081 1726882401.86573: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882401.86586: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.86589: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.86592: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.88865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882401.90600: done with get_vars() 12081 1726882401.90632: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 
September 2024 21:33:21 -0400 (0:00:00.092) 0:00:21.710 ****** 12081 1726882401.90740: entering _queue_task() for managed_node3/include_tasks 12081 1726882401.91080: worker is 1 (out of 1 available) 12081 1726882401.91091: exiting _queue_task() for managed_node3/include_tasks 12081 1726882401.91104: done queuing things up, now waiting for results queue to drain 12081 1726882401.91105: waiting for pending results... 12081 1726882401.91402: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12081 1726882401.91521: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e7 12081 1726882401.91535: variable 'ansible_search_path' from source: unknown 12081 1726882401.91539: variable 'ansible_search_path' from source: unknown 12081 1726882401.91585: calling self._execute() 12081 1726882401.91680: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882401.91684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882401.91694: variable 'omit' from source: magic vars 12081 1726882401.92062: variable 'ansible_distribution_major_version' from source: facts 12081 1726882401.92077: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882401.92082: _execute() done 12081 1726882401.92086: dumping result to json 12081 1726882401.92088: done dumping result, returning 12081 1726882401.92099: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000004e7] 12081 1726882401.92107: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e7 12081 1726882401.92229: no more pending results, returning what we have 12081 1726882401.92235: in VariableManager get_vars() 12081 1726882401.92278: Calling all_inventory to load vars for managed_node3 12081 1726882401.92281: Calling groups_inventory to load vars for managed_node3 12081 1726882401.92285: Calling all_plugins_inventory to load vars for 
managed_node3 12081 1726882401.92300: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.92304: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.92307: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.92828: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e7 12081 1726882401.92831: WORKER PROCESS EXITING 12081 1726882401.94125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882401.95841: done with get_vars() 12081 1726882401.95872: variable 'ansible_search_path' from source: unknown 12081 1726882401.95874: variable 'ansible_search_path' from source: unknown 12081 1726882401.95913: we have included files to process 12081 1726882401.95914: generating all_blocks data 12081 1726882401.95916: done generating all_blocks data 12081 1726882401.95920: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882401.95921: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882401.95924: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12081 1726882401.96859: done processing included file 12081 1726882401.96862: iterating over new_blocks loaded from include file 12081 1726882401.96865: in VariableManager get_vars() 12081 1726882401.96881: done with get_vars() 12081 1726882401.96883: filtering new block on tags 12081 1726882401.96961: done filtering new block on tags 12081 1726882401.96966: in VariableManager get_vars() 12081 1726882401.96981: done with get_vars() 12081 1726882401.96983: filtering new block on tags 12081 1726882401.97041: done filtering new block on tags 12081 1726882401.97043: done iterating over new_blocks loaded 
from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12081 1726882401.97049: extending task lists for all hosts with included blocks 12081 1726882401.97482: done extending task lists 12081 1726882401.97484: done processing included files 12081 1726882401.97485: results queue empty 12081 1726882401.97485: checking for any_errors_fatal 12081 1726882401.97488: done checking for any_errors_fatal 12081 1726882401.97489: checking for max_fail_percentage 12081 1726882401.97490: done checking for max_fail_percentage 12081 1726882401.97491: checking to see if all hosts have failed and the running result is not ok 12081 1726882401.97492: done checking to see if all hosts have failed 12081 1726882401.97492: getting the remaining hosts for this loop 12081 1726882401.97494: done getting the remaining hosts for this loop 12081 1726882401.97496: getting the next task for host managed_node3 12081 1726882401.97501: done getting next task for host managed_node3 12081 1726882401.97503: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882401.97507: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882401.97509: getting variables 12081 1726882401.97510: in VariableManager get_vars() 12081 1726882401.97519: Calling all_inventory to load vars for managed_node3 12081 1726882401.97521: Calling groups_inventory to load vars for managed_node3 12081 1726882401.97524: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882401.97529: Calling all_plugins_play to load vars for managed_node3 12081 1726882401.97531: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882401.97534: Calling groups_plugins_play to load vars for managed_node3 12081 1726882401.98743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882402.01094: done with get_vars() 12081 1726882402.01117: done getting variables 12081 1726882402.01870: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:22 -0400 (0:00:00.111) 0:00:21.822 ****** 12081 1726882402.01906: entering _queue_task() for managed_node3/set_fact 12081 1726882402.02254: worker is 1 (out of 1 available) 12081 1726882402.02268: exiting _queue_task() for managed_node3/set_fact 12081 1726882402.02282: done queuing things up, now waiting for results queue to drain 12081 1726882402.02283: waiting for pending results... 12081 1726882402.03291: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12081 1726882402.03430: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005b4 12081 1726882402.03570: variable 'ansible_search_path' from source: unknown 12081 1726882402.03578: variable 'ansible_search_path' from source: unknown 12081 1726882402.03620: calling self._execute() 12081 1726882402.03752: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.03880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.03894: variable 'omit' from source: magic vars 12081 1726882402.04426: variable 'ansible_distribution_major_version' from source: facts 12081 1726882402.04439: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882402.04450: variable 'omit' from source: magic vars 12081 1726882402.04513: variable 'omit' from source: magic vars 12081 1726882402.04545: variable 'omit' from source: magic vars 12081 1726882402.04593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882402.04624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882402.04644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882402.04668: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.04682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.04711: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882402.04714: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.04717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.04825: Set connection var ansible_pipelining to False 12081 1726882402.04828: Set connection var ansible_shell_type to sh 12081 1726882402.04834: Set connection var ansible_shell_executable to /bin/sh 12081 1726882402.04837: Set connection var ansible_connection to ssh 12081 1726882402.04842: Set connection var ansible_timeout to 10 12081 1726882402.04849: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882402.04879: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.04882: variable 'ansible_connection' from source: unknown 12081 1726882402.04886: variable 'ansible_module_compression' from source: unknown 12081 1726882402.04893: variable 'ansible_shell_type' from source: unknown 12081 1726882402.04896: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.04898: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.04903: variable 'ansible_pipelining' from source: unknown 12081 1726882402.04905: variable 'ansible_timeout' from source: unknown 12081 1726882402.04909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.05053: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882402.05067: variable 'omit' from source: magic vars 12081 1726882402.05074: starting attempt loop 12081 1726882402.05077: running the handler 12081 1726882402.05090: handler run complete 12081 1726882402.05100: attempt loop complete, returning result 12081 1726882402.05102: _execute() done 12081 1726882402.05105: dumping result to json 12081 1726882402.05113: done dumping result, returning 12081 1726882402.05121: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-0a3f-ff3c-0000000005b4] 12081 1726882402.05128: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b4 12081 1726882402.05216: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b4 12081 1726882402.05219: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12081 1726882402.05283: no more pending results, returning what we have 12081 1726882402.05287: results queue empty 12081 1726882402.05288: checking for any_errors_fatal 12081 1726882402.05290: done checking for any_errors_fatal 12081 1726882402.05290: checking for max_fail_percentage 12081 1726882402.05292: done checking for max_fail_percentage 12081 1726882402.05293: checking to see if all hosts have failed and the running result is not ok 12081 1726882402.05294: done checking to see if all hosts have failed 12081 1726882402.05295: getting the remaining hosts for this loop 12081 1726882402.05297: done getting the remaining hosts for this loop 12081 1726882402.05301: getting the next task for host managed_node3 12081 1726882402.05310: done getting next task for host managed_node3 12081 1726882402.05313: ^ task is: 
TASK: Stat profile file 12081 1726882402.05319: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882402.05323: getting variables 12081 1726882402.05325: in VariableManager get_vars() 12081 1726882402.05360: Calling all_inventory to load vars for managed_node3 12081 1726882402.05365: Calling groups_inventory to load vars for managed_node3 12081 1726882402.05369: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882402.05383: Calling all_plugins_play to load vars for managed_node3 12081 1726882402.05385: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882402.05388: Calling groups_plugins_play to load vars for managed_node3 12081 1726882402.14520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882402.16306: done with get_vars() 12081 1726882402.16358: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:22 -0400 (0:00:00.145) 0:00:21.967 ****** 12081 1726882402.16479: entering _queue_task() for managed_node3/stat 12081 1726882402.16825: worker is 1 (out of 1 available) 12081 1726882402.16837: exiting _queue_task() for managed_node3/stat 12081 1726882402.16849: done queuing things up, now waiting for results queue to drain 12081 1726882402.16853: waiting for pending results... 
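[Editor's note] The set_fact task traced above reported `ansible_facts` containing three `lsr_net_profile_*` flags with `"changed": false`. As a rough illustration only — not Ansible's actual implementation — the effect of such a result is to overlay the returned facts onto the host's existing fact namespace without marking the task changed; the helper name below is hypothetical:

```python
# Hypothetical sketch of what the set_fact result in the trace does:
# overlay new facts onto the per-host fact store, report changed=False.
host_facts = {}

def apply_set_fact(facts: dict, new_facts: dict) -> dict:
    # set_fact merges keys into the existing facts for the host
    facts.update(new_facts)
    return {"ansible_facts": new_facts, "changed": False}

result = apply_set_fact(host_facts, {
    "lsr_net_profile_ansible_managed": False,
    "lsr_net_profile_exists": False,
    "lsr_net_profile_fingerprint": False,
})
```

The flags initialized here (all `False`) are what the subsequent "Stat profile file" task goes on to test against the real filesystem.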
12081 1726882402.17448: running TaskExecutor() for managed_node3/TASK: Stat profile file 12081 1726882402.17454: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005b5 12081 1726882402.17457: variable 'ansible_search_path' from source: unknown 12081 1726882402.17461: variable 'ansible_search_path' from source: unknown 12081 1726882402.17466: calling self._execute() 12081 1726882402.17469: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.17472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.17481: variable 'omit' from source: magic vars 12081 1726882402.18141: variable 'ansible_distribution_major_version' from source: facts 12081 1726882402.18153: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882402.18162: variable 'omit' from source: magic vars 12081 1726882402.18235: variable 'omit' from source: magic vars 12081 1726882402.18339: variable 'profile' from source: include params 12081 1726882402.18344: variable 'bond_port_profile' from source: include params 12081 1726882402.18432: variable 'bond_port_profile' from source: include params 12081 1726882402.18449: variable 'omit' from source: magic vars 12081 1726882402.18499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882402.18539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882402.18566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882402.18584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.18597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.18627: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882402.18630: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.18633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.18744: Set connection var ansible_pipelining to False 12081 1726882402.18747: Set connection var ansible_shell_type to sh 12081 1726882402.18756: Set connection var ansible_shell_executable to /bin/sh 12081 1726882402.18758: Set connection var ansible_connection to ssh 12081 1726882402.18767: Set connection var ansible_timeout to 10 12081 1726882402.18773: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882402.18798: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.18801: variable 'ansible_connection' from source: unknown 12081 1726882402.18803: variable 'ansible_module_compression' from source: unknown 12081 1726882402.18805: variable 'ansible_shell_type' from source: unknown 12081 1726882402.18808: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.18812: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.18814: variable 'ansible_pipelining' from source: unknown 12081 1726882402.18816: variable 'ansible_timeout' from source: unknown 12081 1726882402.18819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.19031: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882402.19043: variable 'omit' from source: magic vars 12081 1726882402.19051: starting attempt loop 12081 1726882402.19057: running the handler 12081 1726882402.19075: _low_level_execute_command(): starting 12081 1726882402.19082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882402.19865: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.19880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.19943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.19966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.20008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.20027: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.20042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.20066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.20076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.20084: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.20093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.20106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.20118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.20125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.20138: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.20157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.20236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.20254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.20274: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.20402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.22110: stdout chunk (state=3): >>>/root <<< 12081 1726882402.22286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.22290: stdout chunk (state=3): >>><<< 12081 1726882402.22300: stderr chunk (state=3): >>><<< 12081 1726882402.22323: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.22338: _low_level_execute_command(): starting 12081 1726882402.22341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186 `" && echo 
ansible-tmp-1726882402.2232285-13055-31189195803186="` echo /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186 `" ) && sleep 0' 12081 1726882402.23839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.23843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.23891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.23895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.23915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882402.23918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.23991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.23999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.24011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.24135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.26022: stdout chunk (state=3): >>>ansible-tmp-1726882402.2232285-13055-31189195803186=/root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186 <<< 12081 1726882402.26209: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.26212: stdout chunk (state=3): >>><<< 12081 1726882402.26219: stderr chunk (state=3): >>><<< 12081 1726882402.26240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.2232285-13055-31189195803186=/root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.26292: variable 'ansible_module_compression' from source: unknown 12081 1726882402.26359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882402.26396: variable 'ansible_facts' from source: unknown 12081 1726882402.26483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/AnsiballZ_stat.py 12081 1726882402.26623: Sending 
initial data 12081 1726882402.26626: Sent initial data (152 bytes) 12081 1726882402.27592: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.27601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.27612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.27626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.27669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.27676: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.27686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.27699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.27707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.27713: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.27721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.27730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.27741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.27748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.27758: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.27767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.27841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.27857: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.27865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.28001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.29732: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882402.29832: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882402.29937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpgrj9pu3m /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/AnsiballZ_stat.py <<< 12081 1726882402.30034: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882402.31332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.31513: stderr chunk (state=3): >>><<< 12081 1726882402.31516: stdout chunk (state=3): >>><<< 12081 1726882402.31540: done transferring module to remote 12081 1726882402.31551: _low_level_execute_command(): starting 12081 1726882402.31559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/ /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/AnsiballZ_stat.py && sleep 0' 12081 
1726882402.32240: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.32249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.32265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.32286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.32326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.32333: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.32343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.32359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.32369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.32376: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.32384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.32396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.32409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.32416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.32423: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.32432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.32510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.32531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.32542: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.32676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.34515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.34519: stdout chunk (state=3): >>><<< 12081 1726882402.34525: stderr chunk (state=3): >>><<< 12081 1726882402.34544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.34548: _low_level_execute_command(): starting 12081 1726882402.34552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/AnsiballZ_stat.py && sleep 0' 12081 1726882402.35216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.35225: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.35236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.35250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.35298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.35304: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.35315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.35328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.35337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.35344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.35349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.35362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.35376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.35384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.35390: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.35400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.35477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.35493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.35496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.35638: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.48606: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882402.49599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882402.49604: stdout chunk (state=3): >>><<< 12081 1726882402.49607: stderr chunk (state=3): >>><<< 12081 1726882402.49740: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882402.49744: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882402.49752: _low_level_execute_command(): starting 12081 1726882402.49754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.2232285-13055-31189195803186/ > /dev/null 2>&1 && sleep 0' 12081 1726882402.50740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.50744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.50781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882402.50784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 
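[Editor's note] The `_execute_module(stat, …)` record above shows the pattern used throughout this trace: the remote AnsiballZ wrapper prints a single JSON object on stdout, and the controller reads fields such as `stat.exists` from it. A minimal, self-contained sketch of that parsing step, using the exact payload captured in the trace (this is an illustration, not Ansible's internal code):

```python
import json

# The stat module's stdout as captured in the trace above: one JSON object.
raw_stdout = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"get_attributes": false, '
    '"get_checksum": false, "get_mime": false, '
    '"path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", '
    '"follow": false, "checksum_algorithm": "sha1"}}}'
)

# Parse the module result and pull out the field the playbook cares about.
result = json.loads(raw_stdout)
profile_exists = result["stat"]["exists"]
print(profile_exists)  # prints: False
```

Because `stat.exists` is false here, the `lsr_net_profile_exists` flag initialized earlier stays false for this profile file.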
12081 1726882402.50786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.50788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.50854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.50858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.50860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.50983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.52771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.52847: stderr chunk (state=3): >>><<< 12081 1726882402.52851: stdout chunk (state=3): >>><<< 12081 1726882402.52970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.52974: handler run complete 12081 1726882402.52976: attempt loop complete, returning result 12081 1726882402.52978: _execute() done 12081 1726882402.52980: dumping result to json 12081 1726882402.52982: done dumping result, returning 12081 1726882402.52984: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-0a3f-ff3c-0000000005b5] 12081 1726882402.52986: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b5 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12081 1726882402.53119: no more pending results, returning what we have 12081 1726882402.53122: results queue empty 12081 1726882402.53123: checking for any_errors_fatal 12081 1726882402.53131: done checking for any_errors_fatal 12081 1726882402.53131: checking for max_fail_percentage 12081 1726882402.53134: done checking for max_fail_percentage 12081 1726882402.53134: checking to see if all hosts have failed and the running result is not ok 12081 1726882402.53135: done checking to see if all hosts have failed 12081 1726882402.53136: getting the remaining hosts for this loop 12081 1726882402.53138: done getting the remaining hosts for this loop 12081 1726882402.53141: getting the next task for host managed_node3 12081 1726882402.53149: done getting next task for host managed_node3 12081 1726882402.53154: ^ task is: TASK: Set NM profile exist flag based on the profile files 12081 1726882402.53161: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882402.53167: getting variables 12081 1726882402.53168: in VariableManager get_vars() 12081 1726882402.53200: Calling all_inventory to load vars for managed_node3 12081 1726882402.53203: Calling groups_inventory to load vars for managed_node3 12081 1726882402.53206: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882402.53218: Calling all_plugins_play to load vars for managed_node3 12081 1726882402.53220: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882402.53222: Calling groups_plugins_play to load vars for managed_node3 12081 1726882402.53870: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b5 12081 1726882402.53875: WORKER PROCESS EXITING 12081 1726882402.55007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882402.56942: done with get_vars() 12081 1726882402.56980: done getting variables 12081 1726882402.57042: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:22 -0400 (0:00:00.406) 0:00:22.373 ****** 12081 1726882402.57086: entering _queue_task() for managed_node3/set_fact 12081 1726882402.57455: worker is 1 (out of 1 available) 12081 1726882402.57471: exiting _queue_task() for managed_node3/set_fact 12081 1726882402.57484: done queuing things up, now waiting for results queue to drain 12081 1726882402.57485: waiting for pending results... 
12081 1726882402.58174: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12081 1726882402.58339: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005b6 12081 1726882402.58367: variable 'ansible_search_path' from source: unknown 12081 1726882402.58372: variable 'ansible_search_path' from source: unknown 12081 1726882402.58404: calling self._execute() 12081 1726882402.58557: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.58565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.58577: variable 'omit' from source: magic vars 12081 1726882402.59042: variable 'ansible_distribution_major_version' from source: facts 12081 1726882402.59055: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882402.59157: variable 'profile_stat' from source: set_fact 12081 1726882402.59166: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882402.59170: when evaluation is False, skipping this task 12081 1726882402.59173: _execute() done 12081 1726882402.59175: dumping result to json 12081 1726882402.59178: done dumping result, returning 12081 1726882402.59185: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-0a3f-ff3c-0000000005b6] 12081 1726882402.59191: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b6 12081 1726882402.59284: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b6 12081 1726882402.59288: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882402.59336: no more pending results, returning what we have 12081 1726882402.59340: results queue empty 12081 1726882402.59341: checking for any_errors_fatal 12081 1726882402.59354: done checking for any_errors_fatal 12081 1726882402.59354: 
checking for max_fail_percentage 12081 1726882402.59357: done checking for max_fail_percentage 12081 1726882402.59358: checking to see if all hosts have failed and the running result is not ok 12081 1726882402.59359: done checking to see if all hosts have failed 12081 1726882402.59360: getting the remaining hosts for this loop 12081 1726882402.59361: done getting the remaining hosts for this loop 12081 1726882402.59367: getting the next task for host managed_node3 12081 1726882402.59375: done getting next task for host managed_node3 12081 1726882402.59378: ^ task is: TASK: Get NM profile info 12081 1726882402.59385: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12081 1726882402.59388: getting variables 12081 1726882402.59390: in VariableManager get_vars() 12081 1726882402.59421: Calling all_inventory to load vars for managed_node3 12081 1726882402.59424: Calling groups_inventory to load vars for managed_node3 12081 1726882402.59427: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882402.59438: Calling all_plugins_play to load vars for managed_node3 12081 1726882402.59440: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882402.59442: Calling groups_plugins_play to load vars for managed_node3 12081 1726882402.60602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882402.62889: done with get_vars() 12081 1726882402.62926: done getting variables 12081 1726882402.62995: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:22 -0400 (0:00:00.059) 0:00:22.433 ****** 12081 1726882402.63035: entering _queue_task() for managed_node3/shell 12081 1726882402.63404: worker is 1 (out of 1 available) 12081 1726882402.63421: exiting _queue_task() for managed_node3/shell 12081 1726882402.63471: done queuing things up, now waiting for results queue to drain 12081 1726882402.63473: waiting for pending results... 
12081 1726882402.63855: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12081 1726882402.63997: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005b7 12081 1726882402.64018: variable 'ansible_search_path' from source: unknown 12081 1726882402.64025: variable 'ansible_search_path' from source: unknown 12081 1726882402.64077: calling self._execute() 12081 1726882402.64184: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.64197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.64210: variable 'omit' from source: magic vars 12081 1726882402.64602: variable 'ansible_distribution_major_version' from source: facts 12081 1726882402.64621: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882402.64631: variable 'omit' from source: magic vars 12081 1726882402.64707: variable 'omit' from source: magic vars 12081 1726882402.64811: variable 'profile' from source: include params 12081 1726882402.64825: variable 'bond_port_profile' from source: include params 12081 1726882402.64895: variable 'bond_port_profile' from source: include params 12081 1726882402.64919: variable 'omit' from source: magic vars 12081 1726882402.64972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882402.65010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882402.65035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882402.65062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.65083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882402.65116: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882402.65124: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.65132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.65243: Set connection var ansible_pipelining to False 12081 1726882402.65254: Set connection var ansible_shell_type to sh 12081 1726882402.65271: Set connection var ansible_shell_executable to /bin/sh 12081 1726882402.65279: Set connection var ansible_connection to ssh 12081 1726882402.65289: Set connection var ansible_timeout to 10 12081 1726882402.65298: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882402.65327: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.65335: variable 'ansible_connection' from source: unknown 12081 1726882402.65342: variable 'ansible_module_compression' from source: unknown 12081 1726882402.65348: variable 'ansible_shell_type' from source: unknown 12081 1726882402.65359: variable 'ansible_shell_executable' from source: unknown 12081 1726882402.65370: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882402.65380: variable 'ansible_pipelining' from source: unknown 12081 1726882402.65387: variable 'ansible_timeout' from source: unknown 12081 1726882402.65394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882402.65804: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882402.65821: variable 'omit' from source: magic vars 12081 1726882402.65829: starting attempt loop 12081 1726882402.65835: running the handler 12081 1726882402.65849: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882402.65879: _low_level_execute_command(): starting 12081 1726882402.65890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882402.67584: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.67611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.67628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.67653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.67702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.67738: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.67755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.67777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.67789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.67802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.67815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.67828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.67844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.67859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.67874: stderr chunk (state=3): >>>debug2: match 
found <<< 12081 1726882402.67892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.67976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.67994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.68009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.68145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.69811: stdout chunk (state=3): >>>/root <<< 12081 1726882402.69989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.69992: stdout chunk (state=3): >>><<< 12081 1726882402.70003: stderr chunk (state=3): >>><<< 12081 1726882402.70027: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 12081 1726882402.70042: _low_level_execute_command(): starting 12081 1726882402.70048: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146 `" && echo ansible-tmp-1726882402.700273-13078-107914971991146="` echo /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146 `" ) && sleep 0' 12081 1726882402.70688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.70696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.70708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.70720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.70759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.70768: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.70778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.70792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.70801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.70807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.70814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.70825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.70834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.70841: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.70848: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.70857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.70944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.70948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.70958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.71091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.72993: stdout chunk (state=3): >>>ansible-tmp-1726882402.700273-13078-107914971991146=/root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146 <<< 12081 1726882402.73097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.73190: stderr chunk (state=3): >>><<< 12081 1726882402.73193: stdout chunk (state=3): >>><<< 12081 1726882402.73218: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.700273-13078-107914971991146=/root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.73253: variable 'ansible_module_compression' from source: unknown 12081 1726882402.73309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882402.73345: variable 'ansible_facts' from source: unknown 12081 1726882402.73438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/AnsiballZ_command.py 12081 1726882402.73583: Sending initial data 12081 1726882402.73587: Sent initial data (155 bytes) 12081 1726882402.74519: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.74526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.74535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.74549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.74591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.74599: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.74608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.74622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.74631: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.74637: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.74646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.74655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.74666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.74682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.74689: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.74699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.74770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.74785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.74791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.74922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.76656: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882402.76749: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882402.76853: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpufk5xfj6 /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/AnsiballZ_command.py <<< 12081 1726882402.76949: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882402.78229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.78395: stderr chunk (state=3): >>><<< 12081 1726882402.78399: stdout chunk (state=3): >>><<< 12081 1726882402.78420: done transferring module to remote 12081 1726882402.78432: _low_level_execute_command(): starting 12081 1726882402.78438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/ /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/AnsiballZ_command.py && sleep 0' 12081 1726882402.79129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.79137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.79147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.79165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.79208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.79215: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.79225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.79238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.79245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.79254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.79259: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.79273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.79289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.79296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.79302: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.79311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.79377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.79403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.79415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.79542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.81396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882402.81400: stdout chunk (state=3): >>><<< 12081 1726882402.81406: stderr chunk (state=3): >>><<< 12081 1726882402.81424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882402.81428: _low_level_execute_command(): starting 12081 1726882402.81433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/AnsiballZ_command.py && sleep 0' 12081 1726882402.82064: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882402.82076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.82087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.82103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.82141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.82148: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.82158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.82175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.82183: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882402.82189: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882402.82197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.82206: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.82218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.82226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.82233: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882402.82240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.82315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.82333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.82345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.82493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882402.97797: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:22.954124", "end": "2024-09-20 21:33:22.976582", "delta": "0:00:00.022458", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882402.98928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882402.98984: stderr chunk (state=3): >>><<< 12081 1726882402.98988: stdout chunk (state=3): >>><<< 12081 1726882402.99002: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:22.954124", "end": "2024-09-20 21:33:22.976582", "delta": "0:00:00.022458", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882402.99041: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882402.99049: _low_level_execute_command(): starting 12081 1726882402.99057: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.700273-13078-107914971991146/ > /dev/null 2>&1 && sleep 0' 12081 1726882402.99502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.99507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.99518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.99547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882402.99556: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882402.99569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.99577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882402.99583: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882402.99590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882402.99611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882402.99614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882402.99661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882402.99666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882402.99679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882402.99794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882403.01599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882403.01805: stderr chunk (state=3): >>><<< 12081 1726882403.01808: stdout chunk (state=3): >>><<< 12081 1726882403.01814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882403.01817: handler run complete 12081 1726882403.01818: Evaluated conditional (False): False 12081 1726882403.01820: attempt loop complete, returning result 12081 1726882403.01822: _execute() done 12081 1726882403.01823: dumping result to json 12081 1726882403.01825: done dumping result, returning 12081 1726882403.01826: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-0a3f-ff3c-0000000005b7] 12081 1726882403.01828: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b7 12081 1726882403.01896: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b7 12081 1726882403.01899: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022458", "end": "2024-09-20 21:33:22.976582", "rc": 0, "start": "2024-09-20 21:33:22.954124" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 12081 1726882403.01978: no more pending results, returning what we have 12081 1726882403.01982: results queue empty 12081 1726882403.01982: checking for any_errors_fatal 12081 1726882403.01988: done checking for any_errors_fatal 12081 1726882403.01989: checking for max_fail_percentage 12081 1726882403.01991: done checking for max_fail_percentage 12081 1726882403.01992: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.01992: done checking to see if all hosts have failed 12081 1726882403.01993: getting the 
remaining hosts for this loop 12081 1726882403.01995: done getting the remaining hosts for this loop 12081 1726882403.01998: getting the next task for host managed_node3 12081 1726882403.02006: done getting next task for host managed_node3 12081 1726882403.02010: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882403.02016: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882403.02020: getting variables 12081 1726882403.02021: in VariableManager get_vars() 12081 1726882403.02048: Calling all_inventory to load vars for managed_node3 12081 1726882403.02051: Calling groups_inventory to load vars for managed_node3 12081 1726882403.02060: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.02191: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.02195: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.02198: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.04232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.05169: done with get_vars() 12081 1726882403.05187: done getting variables 12081 1726882403.05230: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:23 -0400 (0:00:00.422) 0:00:22.855 ****** 12081 1726882403.05255: entering _queue_task() for managed_node3/set_fact 12081 1726882403.05486: worker is 1 (out of 1 available) 12081 1726882403.05499: exiting _queue_task() for managed_node3/set_fact 12081 1726882403.05511: done queuing things up, now waiting for results queue to drain 12081 1726882403.05513: waiting for pending results... 
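The "Get NM profile info" task above registers the raw `ansible.legacy.command` result as `nm_profile_exists`, and the queued `set_fact` task keys off its `rc` field (the log later records "Evaluated conditional (nm_profile_exists.rc == 0): True"). A minimal sketch of that evaluation, with the result dict copied from the log output above (trailing fields trimmed):

```python
import json

# Module result as returned by ansible.legacy.command for the nmcli check
# (copied from the log above; invocation details omitted for brevity).
nm_profile_exists = json.loads("""
{"changed": true,
 "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ",
 "stderr": "",
 "rc": 0,
 "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc"}
""")

# Mirrors the logged conditional: a zero exit status from the grep pipeline
# means an /etc-backed NetworkManager profile for bond0.1 was found.
profile_found = nm_profile_exists["rc"] == 0
print(profile_found)
```

The grep pipeline itself exits non-zero when no line matches, so `rc` doubles as the existence flag without parsing `stdout`.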
12081 1726882403.05703: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12081 1726882403.06049: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005b8 12081 1726882403.06068: variable 'ansible_search_path' from source: unknown 12081 1726882403.06072: variable 'ansible_search_path' from source: unknown 12081 1726882403.06152: calling self._execute() 12081 1726882403.06290: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.06294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.06305: variable 'omit' from source: magic vars 12081 1726882403.07429: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.07444: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.07559: variable 'nm_profile_exists' from source: set_fact 12081 1726882403.07576: Evaluated conditional (nm_profile_exists.rc == 0): True 12081 1726882403.07593: variable 'omit' from source: magic vars 12081 1726882403.07662: variable 'omit' from source: magic vars 12081 1726882403.07698: variable 'omit' from source: magic vars 12081 1726882403.07748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882403.07786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882403.07812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882403.07839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.07855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.07892: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12081 1726882403.07902: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.07909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.08020: Set connection var ansible_pipelining to False 12081 1726882403.08028: Set connection var ansible_shell_type to sh 12081 1726882403.08044: Set connection var ansible_shell_executable to /bin/sh 12081 1726882403.08057: Set connection var ansible_connection to ssh 12081 1726882403.08070: Set connection var ansible_timeout to 10 12081 1726882403.08080: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882403.08112: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.08120: variable 'ansible_connection' from source: unknown 12081 1726882403.08128: variable 'ansible_module_compression' from source: unknown 12081 1726882403.08134: variable 'ansible_shell_type' from source: unknown 12081 1726882403.08141: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.08147: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.08157: variable 'ansible_pipelining' from source: unknown 12081 1726882403.08169: variable 'ansible_timeout' from source: unknown 12081 1726882403.08177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.08326: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882403.08342: variable 'omit' from source: magic vars 12081 1726882403.08352: starting attempt loop 12081 1726882403.08358: running the handler 12081 1726882403.08382: handler run complete 12081 1726882403.08397: attempt loop complete, returning result 12081 1726882403.08403: _execute() done 
12081 1726882403.08409: dumping result to json 12081 1726882403.08415: done dumping result, returning 12081 1726882403.08426: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-0a3f-ff3c-0000000005b8] 12081 1726882403.08435: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b8 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12081 1726882403.08597: no more pending results, returning what we have 12081 1726882403.08601: results queue empty 12081 1726882403.08602: checking for any_errors_fatal 12081 1726882403.08611: done checking for any_errors_fatal 12081 1726882403.08612: checking for max_fail_percentage 12081 1726882403.08614: done checking for max_fail_percentage 12081 1726882403.08615: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.08616: done checking to see if all hosts have failed 12081 1726882403.08616: getting the remaining hosts for this loop 12081 1726882403.08618: done getting the remaining hosts for this loop 12081 1726882403.08622: getting the next task for host managed_node3 12081 1726882403.08631: done getting next task for host managed_node3 12081 1726882403.08634: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12081 1726882403.08641: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882403.08645: getting variables 12081 1726882403.08647: in VariableManager get_vars() 12081 1726882403.08683: Calling all_inventory to load vars for managed_node3 12081 1726882403.08686: Calling groups_inventory to load vars for managed_node3 12081 1726882403.08689: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.08703: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.08705: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.08708: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.09229: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005b8 12081 1726882403.09232: WORKER PROCESS EXITING 12081 1726882403.10384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.13528: done with get_vars() 12081 1726882403.13557: done getting variables 12081 1726882403.13605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.13698: variable 'profile' from source: include params 12081 1726882403.13701: variable 'bond_port_profile' from source: include params 12081 1726882403.13757: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:23 -0400 (0:00:00.085) 0:00:22.940 ****** 12081 1726882403.13784: entering _queue_task() for managed_node3/command 12081 1726882403.14068: worker is 1 (out of 1 available) 12081 1726882403.14080: exiting _queue_task() for 
managed_node3/command 12081 1726882403.14094: done queuing things up, now waiting for results queue to drain 12081 1726882403.14095: waiting for pending results... 12081 1726882403.14288: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 12081 1726882403.14379: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005ba 12081 1726882403.14390: variable 'ansible_search_path' from source: unknown 12081 1726882403.14395: variable 'ansible_search_path' from source: unknown 12081 1726882403.14428: calling self._execute() 12081 1726882403.14503: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.14509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.14516: variable 'omit' from source: magic vars 12081 1726882403.14794: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.14806: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.14895: variable 'profile_stat' from source: set_fact 12081 1726882403.14903: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882403.14907: when evaluation is False, skipping this task 12081 1726882403.14910: _execute() done 12081 1726882403.14913: dumping result to json 12081 1726882403.14915: done dumping result, returning 12081 1726882403.14920: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-0a3f-ff3c-0000000005ba] 12081 1726882403.14928: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005ba 12081 1726882403.15016: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005ba 12081 1726882403.15019: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882403.15076: no more pending results, returning what we have 12081 1726882403.15080: 
results queue empty 12081 1726882403.15080: checking for any_errors_fatal 12081 1726882403.15094: done checking for any_errors_fatal 12081 1726882403.15095: checking for max_fail_percentage 12081 1726882403.15097: done checking for max_fail_percentage 12081 1726882403.15098: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.15099: done checking to see if all hosts have failed 12081 1726882403.15100: getting the remaining hosts for this loop 12081 1726882403.15102: done getting the remaining hosts for this loop 12081 1726882403.15105: getting the next task for host managed_node3 12081 1726882403.15115: done getting next task for host managed_node3 12081 1726882403.15118: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12081 1726882403.15124: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.15133: getting variables 12081 1726882403.15135: in VariableManager get_vars() 12081 1726882403.15171: Calling all_inventory to load vars for managed_node3 12081 1726882403.15178: Calling groups_inventory to load vars for managed_node3 12081 1726882403.15182: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.15193: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.15195: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.15198: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.16092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.17612: done with get_vars() 12081 1726882403.17650: done getting variables 12081 1726882403.17702: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.17871: variable 'profile' from source: include params 12081 1726882403.17876: variable 'bond_port_profile' from source: include params 12081 1726882403.17937: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:23 -0400 (0:00:00.041) 0:00:22.982 ****** 12081 1726882403.17966: entering _queue_task() for managed_node3/set_fact 12081 
1726882403.18272: worker is 1 (out of 1 available) 12081 1726882403.18285: exiting _queue_task() for managed_node3/set_fact 12081 1726882403.18297: done queuing things up, now waiting for results queue to drain 12081 1726882403.18299: waiting for pending results... 12081 1726882403.18577: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 12081 1726882403.18665: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005bb 12081 1726882403.18676: variable 'ansible_search_path' from source: unknown 12081 1726882403.18679: variable 'ansible_search_path' from source: unknown 12081 1726882403.18711: calling self._execute() 12081 1726882403.18787: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.18791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.18799: variable 'omit' from source: magic vars 12081 1726882403.19080: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.19091: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.19180: variable 'profile_stat' from source: set_fact 12081 1726882403.19186: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882403.19188: when evaluation is False, skipping this task 12081 1726882403.19192: _execute() done 12081 1726882403.19194: dumping result to json 12081 1726882403.19197: done dumping result, returning 12081 1726882403.19205: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-0a3f-ff3c-0000000005bb] 12081 1726882403.19211: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bb 12081 1726882403.19299: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bb 12081 1726882403.19302: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result 
was False" } 12081 1726882403.19353: no more pending results, returning what we have 12081 1726882403.19357: results queue empty 12081 1726882403.19358: checking for any_errors_fatal 12081 1726882403.19366: done checking for any_errors_fatal 12081 1726882403.19367: checking for max_fail_percentage 12081 1726882403.19369: done checking for max_fail_percentage 12081 1726882403.19370: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.19371: done checking to see if all hosts have failed 12081 1726882403.19372: getting the remaining hosts for this loop 12081 1726882403.19373: done getting the remaining hosts for this loop 12081 1726882403.19377: getting the next task for host managed_node3 12081 1726882403.19385: done getting next task for host managed_node3 12081 1726882403.19388: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12081 1726882403.19395: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.19400: getting variables 12081 1726882403.19401: in VariableManager get_vars() 12081 1726882403.19437: Calling all_inventory to load vars for managed_node3 12081 1726882403.19439: Calling groups_inventory to load vars for managed_node3 12081 1726882403.19442: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.19530: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.19533: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.19536: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.20635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.21829: done with get_vars() 12081 1726882403.21861: done getting variables 12081 1726882403.21927: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.22034: variable 'profile' from source: include params 12081 1726882403.22037: variable 'bond_port_profile' from source: include params 12081 1726882403.22092: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:23 -0400 (0:00:00.041) 
0:00:23.024 ****** 12081 1726882403.22117: entering _queue_task() for managed_node3/command 12081 1726882403.22422: worker is 1 (out of 1 available) 12081 1726882403.22436: exiting _queue_task() for managed_node3/command 12081 1726882403.22450: done queuing things up, now waiting for results queue to drain 12081 1726882403.22451: waiting for pending results... 12081 1726882403.22783: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 12081 1726882403.22977: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005bc 12081 1726882403.23000: variable 'ansible_search_path' from source: unknown 12081 1726882403.23003: variable 'ansible_search_path' from source: unknown 12081 1726882403.23046: calling self._execute() 12081 1726882403.23167: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.23173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.23193: variable 'omit' from source: magic vars 12081 1726882403.23581: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.23601: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.23697: variable 'profile_stat' from source: set_fact 12081 1726882403.23710: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882403.23713: when evaluation is False, skipping this task 12081 1726882403.23716: _execute() done 12081 1726882403.23719: dumping result to json 12081 1726882403.23721: done dumping result, returning 12081 1726882403.23726: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-0a3f-ff3c-0000000005bc] 12081 1726882403.23741: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bc 12081 1726882403.24025: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bc 12081 1726882403.24028: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882403.24128: no more pending results, returning what we have 12081 1726882403.24143: results queue empty 12081 1726882403.24144: checking for any_errors_fatal 12081 1726882403.24167: done checking for any_errors_fatal 12081 1726882403.24168: checking for max_fail_percentage 12081 1726882403.24186: done checking for max_fail_percentage 12081 1726882403.24188: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.24204: done checking to see if all hosts have failed 12081 1726882403.24205: getting the remaining hosts for this loop 12081 1726882403.24207: done getting the remaining hosts for this loop 12081 1726882403.24231: getting the next task for host managed_node3 12081 1726882403.24239: done getting next task for host managed_node3 12081 1726882403.24241: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12081 1726882403.24308: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.24325: getting variables 12081 1726882403.24332: in VariableManager get_vars() 12081 1726882403.24403: Calling all_inventory to load vars for managed_node3 12081 1726882403.24410: Calling groups_inventory to load vars for managed_node3 12081 1726882403.24414: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.24428: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.24473: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.24509: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.25652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.26971: done with get_vars() 12081 1726882403.26992: done getting variables 12081 1726882403.27051: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.27167: variable 'profile' from source: include params 12081 1726882403.27171: variable 'bond_port_profile' from source: include params 12081 1726882403.27233: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:23 -0400 (0:00:00.051) 0:00:23.075 ****** 12081 1726882403.27269: entering _queue_task() for managed_node3/set_fact 12081 1726882403.27527: worker is 1 (out of 1 available) 12081 1726882403.27541: exiting _queue_task() for managed_node3/set_fact 12081 1726882403.27555: done queuing things up, now waiting for results queue to drain 12081 1726882403.27557: waiting for pending results... 12081 1726882403.27787: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 12081 1726882403.27891: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000005bd 12081 1726882403.27903: variable 'ansible_search_path' from source: unknown 12081 1726882403.27906: variable 'ansible_search_path' from source: unknown 12081 1726882403.27938: calling self._execute() 12081 1726882403.28023: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.28027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.28040: variable 'omit' from source: magic vars 12081 1726882403.28438: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.28472: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.28552: variable 'profile_stat' from source: set_fact 12081 1726882403.28567: Evaluated conditional (profile_stat.stat.exists): False 12081 1726882403.28573: when evaluation is False, skipping this task 12081 1726882403.28585: _execute() done 12081 1726882403.28589: dumping result to json 12081 1726882403.28592: done dumping result, returning 12081 1726882403.28594: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-0a3f-ff3c-0000000005bd] 12081 1726882403.28597: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bd 12081 
1726882403.28690: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000005bd 12081 1726882403.28696: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12081 1726882403.28816: no more pending results, returning what we have 12081 1726882403.28820: results queue empty 12081 1726882403.28821: checking for any_errors_fatal 12081 1726882403.28826: done checking for any_errors_fatal 12081 1726882403.28827: checking for max_fail_percentage 12081 1726882403.28828: done checking for max_fail_percentage 12081 1726882403.28829: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.28830: done checking to see if all hosts have failed 12081 1726882403.28830: getting the remaining hosts for this loop 12081 1726882403.28832: done getting the remaining hosts for this loop 12081 1726882403.28834: getting the next task for host managed_node3 12081 1726882403.28843: done getting next task for host managed_node3 12081 1726882403.28845: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12081 1726882403.28850: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.28856: getting variables 12081 1726882403.28858: in VariableManager get_vars() 12081 1726882403.28922: Calling all_inventory to load vars for managed_node3 12081 1726882403.28924: Calling groups_inventory to load vars for managed_node3 12081 1726882403.28927: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.28943: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.28946: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.28949: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.30333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.31860: done with get_vars() 12081 1726882403.31894: done getting variables 12081 1726882403.31973: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.32085: variable 'profile' from source: include params 12081 1726882403.32089: variable 'bond_port_profile' from source: include params 12081 1726882403.32147: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:23 -0400 (0:00:00.049) 0:00:23.124 
****** 12081 1726882403.32187: entering _queue_task() for managed_node3/assert 12081 1726882403.32439: worker is 1 (out of 1 available) 12081 1726882403.32451: exiting _queue_task() for managed_node3/assert 12081 1726882403.32465: done queuing things up, now waiting for results queue to drain 12081 1726882403.32466: waiting for pending results... 12081 1726882403.32701: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 12081 1726882403.32847: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e8 12081 1726882403.32861: variable 'ansible_search_path' from source: unknown 12081 1726882403.32867: variable 'ansible_search_path' from source: unknown 12081 1726882403.32908: calling self._execute() 12081 1726882403.32995: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.33174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.33179: variable 'omit' from source: magic vars 12081 1726882403.33429: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.33433: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.33469: variable 'omit' from source: magic vars 12081 1726882403.33503: variable 'omit' from source: magic vars 12081 1726882403.33730: variable 'profile' from source: include params 12081 1726882403.33734: variable 'bond_port_profile' from source: include params 12081 1726882403.33737: variable 'bond_port_profile' from source: include params 12081 1726882403.33739: variable 'omit' from source: magic vars 12081 1726882403.33741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882403.33882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882403.33886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882403.33888: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.33890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.33893: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882403.33895: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.33897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.33980: Set connection var ansible_pipelining to False 12081 1726882403.33985: Set connection var ansible_shell_type to sh 12081 1726882403.33987: Set connection var ansible_shell_executable to /bin/sh 12081 1726882403.33990: Set connection var ansible_connection to ssh 12081 1726882403.33992: Set connection var ansible_timeout to 10 12081 1726882403.34010: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882403.34030: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.34033: variable 'ansible_connection' from source: unknown 12081 1726882403.34035: variable 'ansible_module_compression' from source: unknown 12081 1726882403.34038: variable 'ansible_shell_type' from source: unknown 12081 1726882403.34040: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.34042: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.34044: variable 'ansible_pipelining' from source: unknown 12081 1726882403.34046: variable 'ansible_timeout' from source: unknown 12081 1726882403.34048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.34233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882403.34237: variable 'omit' from source: magic vars 12081 1726882403.34239: starting attempt loop 12081 1726882403.34242: running the handler 12081 1726882403.34370: variable 'lsr_net_profile_exists' from source: set_fact 12081 1726882403.34386: Evaluated conditional (lsr_net_profile_exists): True 12081 1726882403.34390: handler run complete 12081 1726882403.34392: attempt loop complete, returning result 12081 1726882403.34394: _execute() done 12081 1726882403.34396: dumping result to json 12081 1726882403.34399: done dumping result, returning 12081 1726882403.34401: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0e448fcc-3ce9-0a3f-ff3c-0000000004e8] 12081 1726882403.34403: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e8 12081 1726882403.34600: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e8 12081 1726882403.34603: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882403.34728: no more pending results, returning what we have 12081 1726882403.34731: results queue empty 12081 1726882403.34732: checking for any_errors_fatal 12081 1726882403.34737: done checking for any_errors_fatal 12081 1726882403.34737: checking for max_fail_percentage 12081 1726882403.34739: done checking for max_fail_percentage 12081 1726882403.34740: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.34741: done checking to see if all hosts have failed 12081 1726882403.34742: getting the remaining hosts for this loop 12081 1726882403.34743: done getting the remaining hosts for this loop 12081 1726882403.34746: getting the next task for host managed_node3 12081 1726882403.34752: done getting next task for host managed_node3 
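The `ok` result above comes from an `assert` action whose conditional (`lsr_net_profile_exists`) evaluated to True. Based only on the task name, variable, and task path logged above (assert_profile_present.yml:5), the task is presumably shaped like the following sketch; the actual file contents are not shown in this log:

```yaml
# Hypothetical reconstruction from the log above -- not the actual file.
# Corresponds to assert_profile_present.yml, task path line 5.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists
```

The `MSG: All assertions passed` line in the log is the standard success message emitted by the `assert` action when every `that` condition holds.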
12081 1726882403.34754: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12081 1726882403.34760: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882403.34766: getting variables 12081 1726882403.34767: in VariableManager get_vars() 12081 1726882403.34794: Calling all_inventory to load vars for managed_node3 12081 1726882403.34797: Calling groups_inventory to load vars for managed_node3 12081 1726882403.34800: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.34810: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.34812: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.34815: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.36283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.38151: done with get_vars() 12081 1726882403.38176: done getting variables 12081 1726882403.38236: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.38359: variable 'profile' from source: include params 12081 1726882403.38365: variable 'bond_port_profile' from source: include params 12081 1726882403.38422: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:23 -0400 (0:00:00.062) 0:00:23.187 ****** 12081 1726882403.38455: entering _queue_task() for managed_node3/assert 12081 1726882403.38774: worker is 1 (out of 1 available) 12081 1726882403.38790: exiting _queue_task() for managed_node3/assert 12081 1726882403.38805: done queuing things up, now waiting for results queue to drain 12081 1726882403.38806: waiting for 
pending results... 12081 1726882403.39127: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 12081 1726882403.39244: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004e9 12081 1726882403.39266: variable 'ansible_search_path' from source: unknown 12081 1726882403.39270: variable 'ansible_search_path' from source: unknown 12081 1726882403.39310: calling self._execute() 12081 1726882403.39406: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.39409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.39419: variable 'omit' from source: magic vars 12081 1726882403.39795: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.39813: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.39819: variable 'omit' from source: magic vars 12081 1726882403.39877: variable 'omit' from source: magic vars 12081 1726882403.39976: variable 'profile' from source: include params 12081 1726882403.39980: variable 'bond_port_profile' from source: include params 12081 1726882403.40046: variable 'bond_port_profile' from source: include params 12081 1726882403.40066: variable 'omit' from source: magic vars 12081 1726882403.40110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882403.40148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882403.40170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882403.40188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.40201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 
1726882403.40226: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882403.40229: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.40233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.40378: Set connection var ansible_pipelining to False 12081 1726882403.40381: Set connection var ansible_shell_type to sh 12081 1726882403.40384: Set connection var ansible_shell_executable to /bin/sh 12081 1726882403.40386: Set connection var ansible_connection to ssh 12081 1726882403.40388: Set connection var ansible_timeout to 10 12081 1726882403.40391: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882403.40393: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.40395: variable 'ansible_connection' from source: unknown 12081 1726882403.40397: variable 'ansible_module_compression' from source: unknown 12081 1726882403.40399: variable 'ansible_shell_type' from source: unknown 12081 1726882403.40401: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.40403: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.40770: variable 'ansible_pipelining' from source: unknown 12081 1726882403.40774: variable 'ansible_timeout' from source: unknown 12081 1726882403.40780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.40783: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882403.40786: variable 'omit' from source: magic vars 12081 1726882403.40788: starting attempt loop 12081 1726882403.40789: running the handler 12081 1726882403.40791: variable 
'lsr_net_profile_ansible_managed' from source: set_fact 12081 1726882403.40793: Evaluated conditional (lsr_net_profile_ansible_managed): True 12081 1726882403.40794: handler run complete 12081 1726882403.40796: attempt loop complete, returning result 12081 1726882403.40797: _execute() done 12081 1726882403.40799: dumping result to json 12081 1726882403.40801: done dumping result, returning 12081 1726882403.40802: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0e448fcc-3ce9-0a3f-ff3c-0000000004e9] 12081 1726882403.40804: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e9 12081 1726882403.40867: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004e9 12081 1726882403.40870: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882403.40935: no more pending results, returning what we have 12081 1726882403.40939: results queue empty 12081 1726882403.40940: checking for any_errors_fatal 12081 1726882403.40946: done checking for any_errors_fatal 12081 1726882403.40947: checking for max_fail_percentage 12081 1726882403.40949: done checking for max_fail_percentage 12081 1726882403.40950: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.40951: done checking to see if all hosts have failed 12081 1726882403.40952: getting the remaining hosts for this loop 12081 1726882403.40954: done getting the remaining hosts for this loop 12081 1726882403.40958: getting the next task for host managed_node3 12081 1726882403.40967: done getting next task for host managed_node3 12081 1726882403.40970: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12081 1726882403.40976: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.40981: getting variables 12081 1726882403.40983: in VariableManager get_vars() 12081 1726882403.41017: Calling all_inventory to load vars for managed_node3 12081 1726882403.41020: Calling groups_inventory to load vars for managed_node3 12081 1726882403.41024: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.41037: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.41039: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.41042: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.42747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.44487: done with get_vars() 12081 1726882403.44518: done getting variables 12081 1726882403.44581: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882403.44704: variable 'profile' from source: include params 12081 1726882403.44708: variable 'bond_port_profile' from source: include params 12081 1726882403.44769: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:23 -0400 (0:00:00.063) 0:00:23.250 ****** 12081 1726882403.44803: entering _queue_task() for managed_node3/assert 12081 1726882403.45133: worker is 1 (out of 1 available) 12081 1726882403.45144: exiting _queue_task() for managed_node3/assert 12081 1726882403.45156: done queuing things up, now waiting for results queue to drain 12081 1726882403.45157: waiting for pending results... 
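Earlier in this run, several tasks from get_profile_stat.yml skipped with `"false_condition": "profile_stat.stat.exists"`. That skip pattern is produced by a `when` guard whose conditions the log shows being evaluated one by one. A hedged sketch of what such a guarded task presumably looks like (task name, action, and both conditionals are taken from the log; the exact structure is assumed):

```yaml
# Hypothetical sketch of a guarded task in get_profile_stat.yml.
# When profile_stat.stat.exists evaluates False, Ansible logs
# "skipping" with skip_reason "Conditional result was False".
- name: "Verify the fingerprint comment in ifcfg-{{ profile }}"
  set_fact:
    lsr_net_profile_fingerprint: true
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```

Note that each `when` condition is evaluated in order; in the log above, `ansible_distribution_major_version != '6'` evaluates True first, and the task only skips once `profile_stat.stat.exists` evaluates False.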
12081 1726882403.45471: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 12081 1726882403.45573: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000004ea 12081 1726882403.45588: variable 'ansible_search_path' from source: unknown 12081 1726882403.45592: variable 'ansible_search_path' from source: unknown 12081 1726882403.45638: calling self._execute() 12081 1726882403.45738: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.45742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.45754: variable 'omit' from source: magic vars 12081 1726882403.46131: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.46149: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.46156: variable 'omit' from source: magic vars 12081 1726882403.46218: variable 'omit' from source: magic vars 12081 1726882403.46319: variable 'profile' from source: include params 12081 1726882403.46322: variable 'bond_port_profile' from source: include params 12081 1726882403.46391: variable 'bond_port_profile' from source: include params 12081 1726882403.46409: variable 'omit' from source: magic vars 12081 1726882403.46457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882403.46496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882403.46515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882403.46532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.46543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.46577: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882403.46580: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.46583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.46743: Set connection var ansible_pipelining to False 12081 1726882403.46746: Set connection var ansible_shell_type to sh 12081 1726882403.46749: Set connection var ansible_shell_executable to /bin/sh 12081 1726882403.46754: Set connection var ansible_connection to ssh 12081 1726882403.46756: Set connection var ansible_timeout to 10 12081 1726882403.46759: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882403.46761: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.46765: variable 'ansible_connection' from source: unknown 12081 1726882403.46767: variable 'ansible_module_compression' from source: unknown 12081 1726882403.46769: variable 'ansible_shell_type' from source: unknown 12081 1726882403.46771: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.46773: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.46775: variable 'ansible_pipelining' from source: unknown 12081 1726882403.46777: variable 'ansible_timeout' from source: unknown 12081 1726882403.46779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.46972: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882403.46975: variable 'omit' from source: magic vars 12081 1726882403.46978: starting attempt loop 12081 1726882403.46980: running the handler 12081 1726882403.47081: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12081 1726882403.47084: Evaluated conditional (lsr_net_profile_fingerprint): True 12081 1726882403.47086: handler run complete 12081 1726882403.47089: attempt loop complete, returning result 12081 1726882403.47091: _execute() done 12081 1726882403.47093: dumping result to json 12081 1726882403.47096: done dumping result, returning 12081 1726882403.47098: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0e448fcc-3ce9-0a3f-ff3c-0000000004ea] 12081 1726882403.47100: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004ea 12081 1726882403.47245: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000004ea 12081 1726882403.47249: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882403.47304: no more pending results, returning what we have 12081 1726882403.47308: results queue empty 12081 1726882403.47309: checking for any_errors_fatal 12081 1726882403.47316: done checking for any_errors_fatal 12081 1726882403.47317: checking for max_fail_percentage 12081 1726882403.47319: done checking for max_fail_percentage 12081 1726882403.47320: checking to see if all hosts have failed and the running result is not ok 12081 1726882403.47321: done checking to see if all hosts have failed 12081 1726882403.47322: getting the remaining hosts for this loop 12081 1726882403.47324: done getting the remaining hosts for this loop 12081 1726882403.47328: getting the next task for host managed_node3 12081 1726882403.47340: done getting next task for host managed_node3 12081 1726882403.47343: ^ task is: TASK: ** TEST check bond settings 12081 1726882403.47347: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882403.47352: getting variables 12081 1726882403.47354: in VariableManager get_vars() 12081 1726882403.47390: Calling all_inventory to load vars for managed_node3 12081 1726882403.47393: Calling groups_inventory to load vars for managed_node3 12081 1726882403.47397: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882403.47411: Calling all_plugins_play to load vars for managed_node3 12081 1726882403.47414: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882403.47417: Calling groups_plugins_play to load vars for managed_node3 12081 1726882403.49170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882403.50886: done with get_vars() 12081 1726882403.50917: done getting variables 12081 1726882403.50981: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 21:33:23 -0400 (0:00:00.062) 0:00:23.313 ****** 12081 1726882403.51022: entering _queue_task() for managed_node3/command 12081 1726882403.51355: worker is 1 (out of 1 available) 12081 1726882403.51369: exiting _queue_task() for managed_node3/command 12081 1726882403.51383: done queuing things up, now waiting for results queue to drain 12081 1726882403.51385: waiting for pending results... 12081 1726882403.51675: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 12081 1726882403.51785: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000400 12081 1726882403.51798: variable 'ansible_search_path' from source: unknown 12081 1726882403.51802: variable 'ansible_search_path' from source: unknown 12081 1726882403.51850: variable 'bond_options_to_assert' from source: play vars 12081 1726882403.52053: variable 'bond_options_to_assert' from source: play vars 12081 1726882403.52251: variable 'omit' from source: magic vars 12081 1726882403.52389: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.52399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.52409: variable 'omit' from source: magic vars 12081 1726882403.52638: variable 'ansible_distribution_major_version' from source: facts 12081 1726882403.52649: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882403.52656: variable 'omit' from source: magic vars 12081 1726882403.52711: variable 'omit' from source: magic vars 12081 1726882403.52910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882403.55424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882403.55500: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882403.55541: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882403.55573: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882403.55598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882403.55702: variable 'controller_device' from source: play vars 12081 1726882403.55706: variable 'bond_opt' from source: unknown 12081 1726882403.55730: variable 'omit' from source: magic vars 12081 1726882403.55768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882403.55796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882403.55812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882403.55828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.55838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882403.55874: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882403.55878: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.55880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.55980: Set connection var ansible_pipelining to False 12081 1726882403.55983: Set connection var ansible_shell_type to sh 12081 1726882403.55990: Set connection var ansible_shell_executable to /bin/sh 12081 1726882403.55993: Set connection var ansible_connection to ssh 12081 1726882403.55998: Set connection var 
ansible_timeout to 10 12081 1726882403.56004: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882403.56031: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.56034: variable 'ansible_connection' from source: unknown 12081 1726882403.56037: variable 'ansible_module_compression' from source: unknown 12081 1726882403.56040: variable 'ansible_shell_type' from source: unknown 12081 1726882403.56043: variable 'ansible_shell_executable' from source: unknown 12081 1726882403.56045: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882403.56047: variable 'ansible_pipelining' from source: unknown 12081 1726882403.56050: variable 'ansible_timeout' from source: unknown 12081 1726882403.56056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882403.56158: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882403.56171: variable 'omit' from source: magic vars 12081 1726882403.56177: starting attempt loop 12081 1726882403.56180: running the handler 12081 1726882403.56196: _low_level_execute_command(): starting 12081 1726882403.56199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882403.56922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882403.56933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.56949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.56966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.57006: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882403.57013: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882403.57023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.57037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882403.57044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882403.57056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882403.57065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.57076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.57089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.57095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882403.57102: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882403.57111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.57187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882403.57208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882403.57220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882403.57356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882403.59085: stdout chunk (state=3): >>>/root <<< 12081 1726882403.59188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882403.59236: stderr chunk (state=3): >>><<< 12081 1726882403.59240: stdout chunk (state=3): >>><<< 12081 1726882403.59272: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882403.59282: _low_level_execute_command(): starting 12081 1726882403.59288: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200 `" && echo ansible-tmp-1726882403.592708-13112-111206006027200="` echo /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200 `" ) && sleep 0' 12081 1726882403.59745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.59753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.59797: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.59801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.59804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.59873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882403.59876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882403.60008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882403.61885: stdout chunk (state=3): >>>ansible-tmp-1726882403.592708-13112-111206006027200=/root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200 <<< 12081 1726882403.62086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882403.62090: stdout chunk (state=3): >>><<< 12081 1726882403.62109: stderr chunk (state=3): >>><<< 12081 1726882403.62113: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882403.592708-13112-111206006027200=/root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882403.62147: variable 'ansible_module_compression' from source: unknown 12081 1726882403.62213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882403.62244: variable 'ansible_facts' from source: unknown 12081 1726882403.62340: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/AnsiballZ_command.py 12081 1726882403.62491: Sending initial data 12081 1726882403.62494: Sent initial data (155 bytes) 12081 1726882403.63254: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882403.63258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.63276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.63282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.63312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 
12081 1726882403.63318: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882403.63327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.63336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882403.63342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.63346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.63356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.63365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882403.63378: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882403.63383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.63428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882403.63448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882403.63456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882403.63558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882403.65295: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882403.65393: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882403.65492: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpw_p45m54 /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/AnsiballZ_command.py <<< 12081 1726882403.65591: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882403.67028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882403.67287: stderr chunk (state=3): >>><<< 12081 1726882403.67292: stdout chunk (state=3): >>><<< 12081 1726882403.67294: done transferring module to remote 12081 1726882403.67297: _low_level_execute_command(): starting 12081 1726882403.67300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/ /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/AnsiballZ_command.py && sleep 0' 12081 1726882403.68019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882403.68027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882403.68050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.68056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882403.68058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882403.68129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882403.68132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882403.68242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882403.69996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882403.70104: stderr chunk (state=3): >>><<< 12081 1726882403.70114: stdout chunk (state=3): >>><<< 12081 1726882403.70239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882403.70242: _low_level_execute_command(): starting 12081 1726882403.70245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/AnsiballZ_command.py && sleep 0' 12081 1726882403.71232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882403.71313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882404.84668: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:33:23.841101", "end": "2024-09-20 21:33:24.845101", "delta": "0:00:01.004000", "msg": "", "invocation": {"module_args": {"_raw_params": 
"cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882404.86187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882404.86232: stderr chunk (state=3): >>><<< 12081 1726882404.86235: stdout chunk (state=3): >>><<< 12081 1726882404.86257: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:33:23.841101", "end": "2024-09-20 21:33:24.845101", "delta": "0:00:01.004000", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882404.86296: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882404.86303: _low_level_execute_command(): starting 12081 1726882404.86309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882403.592708-13112-111206006027200/ > /dev/null 2>&1 && sleep 0' 12081 1726882404.87000: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882404.87009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.87020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.87035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.87076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882404.87084: stderr chunk 
(state=3): >>>debug2: match not found <<< 12081 1726882404.87100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.87115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882404.87121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882404.87128: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882404.87135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.87144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.87165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.87190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882404.87198: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882404.87213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.87285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882404.87304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882404.87317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882404.87449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882404.89420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882404.89425: stdout chunk (state=3): >>><<< 12081 1726882404.89430: stderr chunk (state=3): >>><<< 12081 1726882404.89449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882404.89455: handler run complete 12081 1726882404.89485: Evaluated conditional (False): False 12081 1726882404.89642: variable 'bond_opt' from source: unknown 12081 1726882404.89648: variable 'result' from source: unknown 12081 1726882404.89665: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882404.89681: attempt loop complete, returning result 12081 1726882404.89701: variable 'bond_opt' from source: unknown 12081 1726882404.89767: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:01.004000", "end": "2024-09-20 21:33:24.845101", "rc": 0, "start": "2024-09-20 21:33:23.841101" } STDOUT: 802.3ad 4 12081 1726882404.89991: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882404.89994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882404.89996: variable 'omit' from source: magic vars 12081 1726882404.90063: variable 'ansible_distribution_major_version' from source: facts 12081 1726882404.90068: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882404.90073: variable 'omit' from source: magic vars 12081 1726882404.90086: variable 'omit' from source: magic vars 12081 1726882404.90519: variable 'controller_device' from source: play vars 12081 1726882404.90523: variable 'bond_opt' from source: unknown 12081 1726882404.90591: variable 'omit' from source: magic vars 12081 1726882404.90663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882404.90675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882404.90681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882404.90697: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882404.90700: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882404.90702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882404.90910: Set connection var ansible_pipelining to False 12081 1726882404.90914: Set connection var ansible_shell_type to sh 12081 1726882404.90920: Set connection var ansible_shell_executable to /bin/sh 12081 1726882404.90923: Set connection var ansible_connection to ssh 12081 1726882404.90928: Set connection var ansible_timeout to 10 12081 1726882404.90933: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882404.90957: variable 'ansible_shell_executable' from source: unknown 12081 
1726882404.90960: variable 'ansible_connection' from source: unknown 12081 1726882404.90963: variable 'ansible_module_compression' from source: unknown 12081 1726882404.90967: variable 'ansible_shell_type' from source: unknown 12081 1726882404.90969: variable 'ansible_shell_executable' from source: unknown 12081 1726882404.90971: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882404.90974: variable 'ansible_pipelining' from source: unknown 12081 1726882404.91092: variable 'ansible_timeout' from source: unknown 12081 1726882404.91097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882404.91376: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882404.91385: variable 'omit' from source: magic vars 12081 1726882404.91388: starting attempt loop 12081 1726882404.91390: running the handler 12081 1726882404.91399: _low_level_execute_command(): starting 12081 1726882404.91402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882404.92692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882404.92699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.92709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.92724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.92766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882404.92777: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882404.92793: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.92807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882404.92814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882404.92820: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882404.92828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.92837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.92848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.92856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882404.92861: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882404.92874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.92948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882404.92990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882404.93004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882404.93144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882404.94732: stdout chunk (state=3): >>>/root <<< 12081 1726882404.94828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882404.94923: stderr chunk (state=3): >>><<< 12081 1726882404.94929: stdout chunk (state=3): >>><<< 12081 1726882404.94954: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882404.94962: _low_level_execute_command(): starting 12081 1726882404.94969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498 `" && echo ansible-tmp-1726882404.9495134-13112-234099636437498="` echo /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498 `" ) && sleep 0' 12081 1726882404.95881: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.95885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.95927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882404.95933: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882404.95945: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.95951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.95967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882404.95974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.96050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882404.96069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882404.96203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882404.98077: stdout chunk (state=3): >>>ansible-tmp-1726882404.9495134-13112-234099636437498=/root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498 <<< 12081 1726882404.98199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882404.98269: stderr chunk (state=3): >>><<< 12081 1726882404.98272: stdout chunk (state=3): >>><<< 12081 1726882404.98290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882404.9495134-13112-234099636437498=/root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882404.98318: variable 'ansible_module_compression' from source: unknown 12081 1726882404.98358: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882404.98378: variable 'ansible_facts' from source: unknown 12081 1726882404.98439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/AnsiballZ_command.py 12081 1726882404.98571: Sending initial data 12081 1726882404.98575: Sent initial data (156 bytes) 12081 1726882404.99560: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882404.99570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.99581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.99594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.99632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 
1726882404.99639: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882404.99648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.99662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882404.99677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882404.99685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882404.99692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882404.99701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882404.99712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882404.99719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882404.99725: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882404.99735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882404.99808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882404.99822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882404.99832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.00188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.01917: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882405.02016: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882405.02123: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpgz5sl58s /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/AnsiballZ_command.py <<< 12081 1726882405.02216: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882405.03576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.03758: stderr chunk (state=3): >>><<< 12081 1726882405.03762: stdout chunk (state=3): >>><<< 12081 1726882405.03766: done transferring module to remote 12081 1726882405.03768: _low_level_execute_command(): starting 12081 1726882405.03770: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/ /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/AnsiballZ_command.py && sleep 0' 12081 1726882405.04357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.04373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.04386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.04400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.04440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.04450: stderr chunk (state=3): >>>debug2: match not found 
<<< 12081 1726882405.04470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.04486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.04495: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.04504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.04516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.04527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.04539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.04548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.04559: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.04575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.04649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.04671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.04684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.04810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.06607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.06611: stdout chunk (state=3): >>><<< 12081 1726882405.06617: stderr chunk (state=3): >>><<< 12081 1726882405.06635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.06638: _low_level_execute_command(): starting 12081 1726882405.06643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/AnsiballZ_command.py && sleep 0' 12081 1726882405.07294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.07302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.07313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.07332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.07368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.07375: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.07384: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.07397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.07403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.07409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.07416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.07424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.07438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.07445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.07450: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.07459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.07527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.07548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.07559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.07702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.21215: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 21:33:25.207730", "end": "2024-09-20 21:33:25.210763", "delta": "0:00:00.003033", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882405.22572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882405.22576: stdout chunk (state=3): >>><<< 12081 1726882405.22578: stderr chunk (state=3): >>><<< 12081 1726882405.22581: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 21:33:25.207730", "end": "2024-09-20 21:33:25.210763", "delta": "0:00:00.003033", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882405.22583: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882405.22586: _low_level_execute_command(): starting 12081 1726882405.22587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882404.9495134-13112-234099636437498/ > /dev/null 2>&1 && sleep 0' 12081 1726882405.23468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.23476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.23488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.23502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.23544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.23551: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.23560: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.23581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.23587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.23594: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.23601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.23610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.23622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.23629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.23635: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.23644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.23717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.23737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.23749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.23883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.25781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.25799: stderr chunk (state=3): >>><<< 12081 1726882405.25803: stdout chunk (state=3): >>><<< 12081 1726882405.25823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.25830: handler run complete 12081 1726882405.25853: Evaluated conditional (False): False 12081 1726882405.26009: variable 'bond_opt' from source: unknown 12081 1726882405.26015: variable 'result' from source: unknown 12081 1726882405.26029: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882405.26040: attempt loop complete, returning result 12081 1726882405.26059: variable 'bond_opt' from source: unknown 12081 1726882405.26129: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003033", "end": "2024-09-20 21:33:25.210763", "rc": 0, "start": "2024-09-20 21:33:25.207730" } STDOUT: 65535 12081 1726882405.26279: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.26282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882405.26285: variable 'omit' from source: magic vars 12081 1726882405.26418: variable 'ansible_distribution_major_version' from source: facts 12081 1726882405.26424: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882405.26428: variable 'omit' from source: magic vars 12081 1726882405.26447: variable 'omit' from source: magic vars 12081 1726882405.26618: variable 'controller_device' from source: play vars 12081 1726882405.26621: variable 'bond_opt' from source: unknown 12081 1726882405.26643: variable 'omit' from source: magic vars 12081 1726882405.26659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882405.26669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882405.26676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882405.26689: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882405.26691: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.26694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882405.26784: Set connection var ansible_pipelining to False 12081 1726882405.26787: Set connection var ansible_shell_type to sh 12081 1726882405.26794: Set connection var ansible_shell_executable to /bin/sh 12081 1726882405.26796: Set connection var ansible_connection to ssh 12081 1726882405.26802: Set connection var ansible_timeout to 10 12081 1726882405.26807: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882405.26838: variable 'ansible_shell_executable' from source: unknown 12081 1726882405.26841: variable 'ansible_connection' from source: unknown 12081 1726882405.26844: variable 
'ansible_module_compression' from source: unknown 12081 1726882405.26846: variable 'ansible_shell_type' from source: unknown 12081 1726882405.26848: variable 'ansible_shell_executable' from source: unknown 12081 1726882405.26850: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.26855: variable 'ansible_pipelining' from source: unknown 12081 1726882405.26858: variable 'ansible_timeout' from source: unknown 12081 1726882405.26860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882405.26962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882405.26972: variable 'omit' from source: magic vars 12081 1726882405.26975: starting attempt loop 12081 1726882405.26977: running the handler 12081 1726882405.26985: _low_level_execute_command(): starting 12081 1726882405.26987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882405.27965: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.27974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.27984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.27996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.28037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.28044: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.28057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882405.28068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.28076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.28083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.28091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.28100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.28112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.28119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.28125: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.28140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.28214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.28233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.28249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.28379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.29953: stdout chunk (state=3): >>>/root <<< 12081 1726882405.30123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.30126: stdout chunk (state=3): >>><<< 12081 1726882405.30133: stderr chunk (state=3): >>><<< 12081 1726882405.30150: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.30158: _low_level_execute_command(): starting 12081 1726882405.30168: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040 `" && echo ansible-tmp-1726882405.3014915-13112-176664203843040="` echo /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040 `" ) && sleep 0' 12081 1726882405.30774: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.30783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.30794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.30810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.30849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.30857: stderr chunk (state=3): >>>debug2: match not found <<< 12081 
1726882405.30866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.30880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.30888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.30895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.30903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.30913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.30924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.30931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.30938: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.30947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.31018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.31035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.31043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.31190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.33058: stdout chunk (state=3): >>>ansible-tmp-1726882405.3014915-13112-176664203843040=/root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040 <<< 12081 1726882405.33245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.33249: stderr chunk (state=3): >>><<< 12081 1726882405.33254: stdout chunk (state=3): >>><<< 12081 1726882405.33273: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882405.3014915-13112-176664203843040=/root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.33297: variable 'ansible_module_compression' from source: unknown 12081 1726882405.33336: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882405.33355: variable 'ansible_facts' from source: unknown 12081 1726882405.33414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/AnsiballZ_command.py 12081 1726882405.33543: Sending initial data 12081 1726882405.33546: Sent initial data (156 bytes) 12081 1726882405.34576: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.34589: stderr chunk (state=3): >>>debug1: Reading 
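The `( umask 77 && mkdir -p ... )` command above is how Ansible creates its remote tmp directory: with umask 077 the directory comes out mode 0700, readable only by the connecting user. A small sketch of that effect, assuming a throwaway `tempfile` base rather than the log's `/root/.ansible/tmp` path:

```python
import os
import stat
import shutil
import tempfile

# Sketch of the effect of `( umask 77 && mkdir ... )` from the log: with umask
# 077, the created directory ends up mode 0700 (private to the remote user).
old_umask = os.umask(0o77)
try:
    base = tempfile.mkdtemp()
    # "ansible-tmp-demo" stands in for the timestamped ansible-tmp-... name.
    path = os.path.join(base, "ansible-tmp-demo")
    os.mkdir(path)  # default 0o777 masked by umask 0o77 -> 0o700
    mode = stat.S_IMODE(os.stat(path).st_mode)
    print(oct(mode))  # 0o700
finally:
    os.umask(old_umask)
    shutil.rmtree(base, ignore_errors=True)
```

The echoed `ansible-tmp-...=...` line in the log is how the controller learns the directory name it just created, so later steps can transfer `AnsiballZ_command.py` into it.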
configuration data /root/.ssh/config <<< 12081 1726882405.34598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.34610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.34646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.34656: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.34662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.34677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.34684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.34694: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.34701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.34709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.34720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.34726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.34732: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.34740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.34821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.34836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.34839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.34973: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12081 1726882405.36697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882405.36807: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882405.36914: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpnebnbnch /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/AnsiballZ_command.py <<< 12081 1726882405.37005: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882405.39104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.39385: stderr chunk (state=3): >>><<< 12081 1726882405.39391: stdout chunk (state=3): >>><<< 12081 1726882405.39395: done transferring module to remote 12081 1726882405.39400: _low_level_execute_command(): starting 12081 1726882405.39405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/ /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/AnsiballZ_command.py && sleep 0' 12081 1726882405.40540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.40544: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.40591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.40595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.40597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882405.40599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.40656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.40672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.40786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.42529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.42659: stderr chunk (state=3): >>><<< 12081 1726882405.42665: stdout chunk (state=3): >>><<< 12081 1726882405.42759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.42762: _low_level_execute_command(): starting 12081 1726882405.42766: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/AnsiballZ_command.py && sleep 0' 12081 1726882405.43611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.43625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.43638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.43656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.43700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.43711: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.43723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.43738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.43748: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.43765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.43779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.43793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.43810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.43822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.43833: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.43847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.43928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.43945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.43965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.44106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.57820: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 21:33:25.573664", "end": "2024-09-20 21:33:25.576720", "delta": "0:00:00.003056", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882405.59119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 
closed. <<< 12081 1726882405.59123: stdout chunk (state=3): >>><<< 12081 1726882405.59126: stderr chunk (state=3): >>><<< 12081 1726882405.59265: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 21:33:25.573664", "end": "2024-09-20 21:33:25.576720", "delta": "0:00:00.003056", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
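The module's stdout above is a single JSON document that the controller parses back into the task result (`rc`, `stdout`, `cmd`, timing fields, and the `invocation` echo). A sketch of reading it, using a trimmed copy of the log's output (the real result carries more keys):

```python
import json

# Trimmed copy of the AnsiballZ_command.py stdout shown in the log; the full
# result also includes start/end/delta timing and the module_args echo.
raw = (
    '{"changed": true, "stdout": "00:00:5e:00:53:5d", "rc": 0, '
    '"cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"]}'
)
result = json.loads(raw)

print(result["rc"])      # 0
print(result["stdout"])  # 00:00:5e:00:53:5d
```

It is this parsed `stdout` field that the subsequent `bond_opt.value in result.stdout` conditional tests against the expected `ad_actor_system` value.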
12081 1726882405.59274: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882405.59277: _low_level_execute_command(): starting 12081 1726882405.59279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882405.3014915-13112-176664203843040/ > /dev/null 2>&1 && sleep 0' 12081 1726882405.60998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.61003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.61155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882405.61159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.61162: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.61218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.61349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.61359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.61472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.63297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.63370: stderr chunk (state=3): >>><<< 12081 1726882405.63383: stdout chunk (state=3): >>><<< 12081 1726882405.63670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.63673: handler run complete 12081 1726882405.63675: Evaluated conditional (False): False 12081 1726882405.63677: variable 'bond_opt' from source: unknown 12081 1726882405.63679: variable 'result' from source: unknown 12081 1726882405.63681: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882405.63683: attempt loop complete, returning result 12081 1726882405.63688: variable 'bond_opt' from source: unknown 12081 1726882405.63699: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003056", "end": "2024-09-20 21:33:25.576720", "rc": 0, "start": "2024-09-20 21:33:25.573664" } STDOUT: 00:00:5e:00:53:5d 12081 1726882405.63911: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.63923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882405.63935: variable 'omit' from source: magic vars 12081 1726882405.64093: variable 'ansible_distribution_major_version' from source: facts 12081 1726882405.64103: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882405.64110: variable 'omit' from source: magic vars 12081 1726882405.64126: variable 'omit' from source: magic vars 12081 1726882405.64481: variable 'controller_device' from source: play vars 12081 1726882405.64491: variable 'bond_opt' from source: unknown 12081 1726882405.64514: variable 'omit' from source: magic vars 12081 1726882405.64539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 
1726882405.64554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882405.64685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882405.64703: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882405.64714: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.64722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882405.64806: Set connection var ansible_pipelining to False 12081 1726882405.64813: Set connection var ansible_shell_type to sh 12081 1726882405.64879: Set connection var ansible_shell_executable to /bin/sh 12081 1726882405.64887: Set connection var ansible_connection to ssh 12081 1726882405.64897: Set connection var ansible_timeout to 10 12081 1726882405.64907: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882405.64933: variable 'ansible_shell_executable' from source: unknown 12081 1726882405.65075: variable 'ansible_connection' from source: unknown 12081 1726882405.65083: variable 'ansible_module_compression' from source: unknown 12081 1726882405.65091: variable 'ansible_shell_type' from source: unknown 12081 1726882405.65098: variable 'ansible_shell_executable' from source: unknown 12081 1726882405.65105: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882405.65113: variable 'ansible_pipelining' from source: unknown 12081 1726882405.65120: variable 'ansible_timeout' from source: unknown 12081 1726882405.65127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882405.65223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882405.65241: variable 'omit' from source: magic vars 12081 1726882405.65275: starting attempt loop 12081 1726882405.65283: running the handler 12081 1726882405.65293: _low_level_execute_command(): starting 12081 1726882405.65476: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882405.67733: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.67737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.67887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882405.67890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.67893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.67998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.68019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.68032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.68168: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.70094: stdout chunk (state=3): >>>/root <<< 12081 1726882405.70266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.70270: stdout chunk (state=3): >>><<< 12081 1726882405.70279: stderr chunk (state=3): >>><<< 12081 1726882405.70300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.70308: _low_level_execute_command(): starting 12081 1726882405.70313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973 `" && echo ansible-tmp-1726882405.7029905-13112-155630561189973="` echo /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973 `" ) 
&& sleep 0' 12081 1726882405.71957: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.72023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.72036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.72049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.72089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.72096: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.72105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.72118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.72125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.72133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.72144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.72157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.72168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.72175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.72182: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.72191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.72376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.72395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 
1726882405.72408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.72538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.74423: stdout chunk (state=3): >>>ansible-tmp-1726882405.7029905-13112-155630561189973=/root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973 <<< 12081 1726882405.74606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.74610: stdout chunk (state=3): >>><<< 12081 1726882405.74618: stderr chunk (state=3): >>><<< 12081 1726882405.74635: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882405.7029905-13112-155630561189973=/root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.74657: variable 
'ansible_module_compression' from source: unknown 12081 1726882405.74702: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882405.74719: variable 'ansible_facts' from source: unknown 12081 1726882405.74788: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/AnsiballZ_command.py 12081 1726882405.75242: Sending initial data 12081 1726882405.75245: Sent initial data (156 bytes) 12081 1726882405.77628: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882405.77719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.77730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.77745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.77785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.77792: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882405.77802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.77817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882405.77828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882405.77835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882405.77843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.77855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882405.77866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.77875: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882405.77882: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882405.77940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.78012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.78157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.78170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.78297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882405.80045: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882405.80148: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882405.80253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpuvutxa56 /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/AnsiballZ_command.py <<< 12081 1726882405.80349: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882405.81704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.81872: stderr chunk 
(state=3): >>><<< 12081 1726882405.81876: stdout chunk (state=3): >>><<< 12081 1726882405.81983: done transferring module to remote 12081 1726882405.81986: _low_level_execute_command(): starting 12081 1726882405.81989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/ /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/AnsiballZ_command.py && sleep 0' 12081 1726882405.84118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.84122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.84163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882405.84170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.84172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.84356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.84359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882405.84369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.84473: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12081 1726882405.86229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882405.86309: stderr chunk (state=3): >>><<< 12081 1726882405.86312: stdout chunk (state=3): >>><<< 12081 1726882405.86371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882405.86374: _low_level_execute_command(): starting 12081 1726882405.86376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/AnsiballZ_command.py && sleep 0' 12081 1726882405.88070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882405.88080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.88113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882405.88116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882405.88123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882405.88125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882405.88271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882405.88283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882405.88410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.01809: stdout chunk (state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 21:33:26.013433", "end": "2024-09-20 21:33:26.016632", "delta": "0:00:00.003199", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882406.02968: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882406.03043: stderr chunk (state=3): >>><<< 12081 1726882406.03047: stdout chunk (state=3): >>><<< 12081 1726882406.03189: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 21:33:26.013433", "end": "2024-09-20 21:33:26.016632", "delta": "0:00:00.003199", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882406.03193: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882406.03195: _low_level_execute_command(): starting 12081 1726882406.03197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882405.7029905-13112-155630561189973/ > /dev/null 2>&1 && sleep 0' 12081 1726882406.04608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.04613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.04639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.04642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.04644: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.04819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.04888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.04891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.05006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.06840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.06927: stderr chunk (state=3): >>><<< 12081 1726882406.06930: stdout chunk (state=3): >>><<< 12081 1726882406.07074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.07078: handler run complete 12081 1726882406.07080: Evaluated conditional (False): False 12081 1726882406.07181: variable 'bond_opt' from source: unknown 12081 1726882406.07184: variable 'result' from source: unknown 12081 1726882406.07186: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882406.07198: attempt loop complete, returning result 12081 1726882406.07221: variable 'bond_opt' from source: unknown 12081 1726882406.07297: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003199", "end": "2024-09-20 21:33:26.016632", "rc": 0, "start": "2024-09-20 21:33:26.013433" } STDOUT: stable 0 12081 1726882406.07524: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.07536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.07549: variable 'omit' from source: magic vars 12081 1726882406.07714: variable 'ansible_distribution_major_version' from source: facts 12081 1726882406.07781: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882406.07790: variable 'omit' from source: magic vars 12081 1726882406.07808: variable 'omit' from source: magic vars 12081 1726882406.08079: variable 'controller_device' from source: play vars 12081 1726882406.08176: variable 'bond_opt' from source: unknown 12081 1726882406.08199: variable 'omit' from source: magic vars 12081 1726882406.08225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882406.08337: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.08350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.08377: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882406.08386: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.08393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.08571: Set connection var ansible_pipelining to False 12081 1726882406.08579: Set connection var ansible_shell_type to sh 12081 1726882406.08592: Set connection var ansible_shell_executable to /bin/sh 12081 1726882406.08659: Set connection var ansible_connection to ssh 12081 1726882406.08675: Set connection var ansible_timeout to 10 12081 1726882406.08685: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882406.08713: variable 'ansible_shell_executable' from source: unknown 12081 1726882406.08767: variable 'ansible_connection' from source: unknown 12081 1726882406.08776: variable 'ansible_module_compression' from source: unknown 12081 1726882406.08784: variable 'ansible_shell_type' from source: unknown 12081 1726882406.08790: variable 'ansible_shell_executable' from source: unknown 12081 1726882406.08797: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.08804: variable 'ansible_pipelining' from source: unknown 12081 1726882406.08811: variable 'ansible_timeout' from source: unknown 12081 1726882406.08819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.09039: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882406.09056: variable 'omit' from source: magic vars 12081 1726882406.09075: starting attempt loop 12081 1726882406.09093: running the handler 12081 1726882406.09104: _low_level_execute_command(): starting 12081 1726882406.09205: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882406.10945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.10949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.10983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882406.10987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.10990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.11169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.11191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.11194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.11297: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.12885: stdout chunk (state=3): >>>/root <<< 12081 1726882406.12984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.13063: stderr chunk (state=3): >>><<< 12081 1726882406.13069: stdout chunk (state=3): >>><<< 12081 1726882406.13172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.13175: _low_level_execute_command(): starting 12081 1726882406.13178: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404 `" && echo ansible-tmp-1726882406.1308575-13112-33918466761404="` echo /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404 `" ) && 
sleep 0' 12081 1726882406.14849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.14856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.14889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882406.14892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.14894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882406.14896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.14950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.15613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.15616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.15736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.17668: stdout chunk (state=3): >>>ansible-tmp-1726882406.1308575-13112-33918466761404=/root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404 <<< 12081 1726882406.17768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.17847: stderr chunk (state=3): >>><<< 12081 
1726882406.17850: stdout chunk (state=3): >>><<< 12081 1726882406.18105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882406.1308575-13112-33918466761404=/root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.18109: variable 'ansible_module_compression' from source: unknown 12081 1726882406.18111: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882406.18113: variable 'ansible_facts' from source: unknown 12081 1726882406.18116: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/AnsiballZ_command.py 12081 1726882406.18425: Sending initial data 12081 1726882406.18429: Sent initial data (155 bytes) 12081 1726882406.20797: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.20961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.20979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.20997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.21042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.21062: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.21082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.21099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.21110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.21120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.21131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.21143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.21163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.21177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.21187: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.21199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.21393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.21409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.21422: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882406.21608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.23358: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882406.23456: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882406.23557: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp8vlexvdq /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/AnsiballZ_command.py <<< 12081 1726882406.23649: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882406.25121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.25204: stderr chunk (state=3): >>><<< 12081 1726882406.25208: stdout chunk (state=3): >>><<< 12081 1726882406.25228: done transferring module to remote 12081 1726882406.25236: _low_level_execute_command(): starting 12081 1726882406.25241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/ /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/AnsiballZ_command.py && sleep 0' 12081 1726882406.26723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882406.26845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.26857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.26869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.26907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.26947: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.26957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.26972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.26980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.26986: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.26993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.27002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.27012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.27056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.27068: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.27071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.27136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.27286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.27299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.27429: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.29272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.29276: stdout chunk (state=3): >>><<< 12081 1726882406.29283: stderr chunk (state=3): >>><<< 12081 1726882406.29301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.29304: _low_level_execute_command(): starting 12081 1726882406.29307: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/AnsiballZ_command.py && sleep 0' 12081 1726882406.31156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.31333: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.31348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.31394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.31401: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.31412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.31430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.31438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.31445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.31455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.31462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.31477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.31485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.31492: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.31501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.31577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.31658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.31665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.31803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.45206: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 21:33:26.447756", "end": "2024-09-20 21:33:26.450643", "delta": "0:00:00.002887", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882406.46408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882406.46412: stdout chunk (state=3): >>><<< 12081 1726882406.46417: stderr chunk (state=3): >>><<< 12081 1726882406.46439: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 21:33:26.447756", "end": "2024-09-20 21:33:26.450643", "delta": "0:00:00.002887", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882406.46469: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882406.46474: _low_level_execute_command(): starting 12081 1726882406.46479: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882406.1308575-13112-33918466761404/ > /dev/null 2>&1 && sleep 0' 12081 1726882406.48921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.48931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.48937: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882406.48954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.48998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.49008: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.49018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.49031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.49040: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.49044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.49055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.49062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.49076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.49083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.49090: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.49099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.49179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.49198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.49209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.49345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.51254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
12081 1726882406.51259: stdout chunk (state=3): >>><<< 12081 1726882406.51261: stderr chunk (state=3): >>><<< 12081 1726882406.51284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.51287: handler run complete 12081 1726882406.51312: Evaluated conditional (False): False 12081 1726882406.51462: variable 'bond_opt' from source: unknown 12081 1726882406.51474: variable 'result' from source: unknown 12081 1726882406.51488: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882406.51499: attempt loop complete, returning result 12081 1726882406.51518: variable 'bond_opt' from source: unknown 12081 1726882406.51589: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": 
"ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.002887", "end": "2024-09-20 21:33:26.450643", "rc": 0, "start": "2024-09-20 21:33:26.447756" } STDOUT: 1023 12081 1726882406.51737: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.51740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.51743: variable 'omit' from source: magic vars 12081 1726882406.51858: variable 'ansible_distribution_major_version' from source: facts 12081 1726882406.51862: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882406.51869: variable 'omit' from source: magic vars 12081 1726882406.51885: variable 'omit' from source: magic vars 12081 1726882406.52269: variable 'controller_device' from source: play vars 12081 1726882406.52273: variable 'bond_opt' from source: unknown 12081 1726882406.52291: variable 'omit' from source: magic vars 12081 1726882406.52311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882406.52320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.52326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.52456: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882406.52460: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.52462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.52536: Set connection var ansible_pipelining to False 12081 1726882406.52540: Set connection var ansible_shell_type to sh 12081 1726882406.52545: Set connection var 
ansible_shell_executable to /bin/sh 12081 1726882406.52548: Set connection var ansible_connection to ssh 12081 1726882406.52670: Set connection var ansible_timeout to 10 12081 1726882406.52677: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882406.52699: variable 'ansible_shell_executable' from source: unknown 12081 1726882406.52702: variable 'ansible_connection' from source: unknown 12081 1726882406.52705: variable 'ansible_module_compression' from source: unknown 12081 1726882406.52707: variable 'ansible_shell_type' from source: unknown 12081 1726882406.52710: variable 'ansible_shell_executable' from source: unknown 12081 1726882406.52712: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.52714: variable 'ansible_pipelining' from source: unknown 12081 1726882406.52718: variable 'ansible_timeout' from source: unknown 12081 1726882406.52722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.52927: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882406.52935: variable 'omit' from source: magic vars 12081 1726882406.52938: starting attempt loop 12081 1726882406.52940: running the handler 12081 1726882406.52949: _low_level_execute_command(): starting 12081 1726882406.52954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882406.54767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.54776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.54911: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882406.54915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882406.55046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.55050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.55141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.55160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.55373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.56889: stdout chunk (state=3): >>>/root <<< 12081 1726882406.57055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.57059: stderr chunk (state=3): >>><<< 12081 1726882406.57062: stdout chunk (state=3): >>><<< 12081 1726882406.57091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.57094: _low_level_execute_command(): starting 12081 1726882406.57097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629 `" && echo ansible-tmp-1726882406.5708413-13112-84912048727629="` echo /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629 `" ) && sleep 0' 12081 1726882406.58415: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.58420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.58580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.58585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.58602: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.58608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.58732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.58737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.58757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.58900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.60783: stdout chunk (state=3): >>>ansible-tmp-1726882406.5708413-13112-84912048727629=/root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629 <<< 12081 1726882406.60954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.60959: stderr chunk (state=3): >>><<< 12081 1726882406.60962: stdout chunk (state=3): >>><<< 12081 1726882406.60980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882406.5708413-13112-84912048727629=/root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.61005: variable 'ansible_module_compression' from source: unknown 12081 1726882406.61046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882406.61066: variable 'ansible_facts' from source: unknown 12081 1726882406.61127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/AnsiballZ_command.py 12081 1726882406.61787: Sending initial data 12081 1726882406.61791: Sent initial data (155 bytes) 12081 1726882406.63687: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.63759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.63772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.63786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.63826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.63832: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.63862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.63877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.63903: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.63911: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.63917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.63926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.63938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.63971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.63978: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.63987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.64054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.64201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.64213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.64339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.66085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882406.66184: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server 
upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882406.66289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp487gucl5 /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/AnsiballZ_command.py <<< 12081 1726882406.66389: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882406.67742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.67970: stderr chunk (state=3): >>><<< 12081 1726882406.67973: stdout chunk (state=3): >>><<< 12081 1726882406.67975: done transferring module to remote 12081 1726882406.67978: _low_level_execute_command(): starting 12081 1726882406.67980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/ /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/AnsiballZ_command.py && sleep 0' 12081 1726882406.68728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.68742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.68771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.68791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.68835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.68847: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.68873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.68891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.68903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 
12081 1726882406.68914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.68927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.68940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.68959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.68976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.68989: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.69002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.69083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.69105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.69120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.69323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.71148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.71154: stdout chunk (state=3): >>><<< 12081 1726882406.71157: stderr chunk (state=3): >>><<< 12081 1726882406.71254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.71258: _low_level_execute_command(): starting 12081 1726882406.71261: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/AnsiballZ_command.py && sleep 0' 12081 1726882406.71858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.71874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.71887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.71901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.71945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.71958: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.71973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.71988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.71997: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.72005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.72014: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.72036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.72050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.72063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.72077: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.72087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.72171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.72187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.72199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.72333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.85688: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 21:33:26.852413", "end": "2024-09-20 21:33:26.855432", "delta": "0:00:00.003019", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882406.86877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882406.86955: stderr chunk (state=3): >>><<< 12081 1726882406.86959: stdout chunk (state=3): >>><<< 12081 1726882406.87092: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 21:33:26.852413", "end": "2024-09-20 21:33:26.855432", "delta": "0:00:00.003019", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882406.87096: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882406.87099: _low_level_execute_command(): starting 12081 1726882406.87101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882406.5708413-13112-84912048727629/ > /dev/null 2>&1 && sleep 0' 12081 1726882406.88677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.88712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.88731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.88780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.88793: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.88809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.88826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.88840: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.88855: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.88871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.88886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.88903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.88916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.88929: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.88942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.89224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.89242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.89260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.89893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.91805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.91809: stdout chunk (state=3): >>><<< 12081 1726882406.92268: stderr chunk (state=3): >>><<< 12081 1726882406.92272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.92278: handler run complete 12081 1726882406.92280: Evaluated conditional (False): False 12081 1726882406.92283: variable 'bond_opt' from source: unknown 12081 1726882406.92285: variable 'result' from source: unknown 12081 1726882406.92287: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882406.92289: attempt loop complete, returning result 12081 1726882406.92291: variable 'bond_opt' from source: unknown 12081 1726882406.92293: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003019", "end": "2024-09-20 21:33:26.855432", "rc": 0, "start": "2024-09-20 21:33:26.852413" } STDOUT: 1 12081 1726882406.92405: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.92409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.92412: variable 'omit' from source: magic vars 12081 1726882406.92568: variable 'ansible_distribution_major_version' from source: facts 12081 1726882406.92581: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 12081 1726882406.92590: variable 'omit' from source: magic vars 12081 1726882406.92609: variable 'omit' from source: magic vars 12081 1726882406.92778: variable 'controller_device' from source: play vars 12081 1726882406.92788: variable 'bond_opt' from source: unknown 12081 1726882406.92815: variable 'omit' from source: magic vars 12081 1726882406.92845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882406.92862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.92877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882406.92894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882406.92902: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.92908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.92995: Set connection var ansible_pipelining to False 12081 1726882406.93003: Set connection var ansible_shell_type to sh 12081 1726882406.93015: Set connection var ansible_shell_executable to /bin/sh 12081 1726882406.93024: Set connection var ansible_connection to ssh 12081 1726882406.93035: Set connection var ansible_timeout to 10 12081 1726882406.93044: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882406.93076: variable 'ansible_shell_executable' from source: unknown 12081 1726882406.93084: variable 'ansible_connection' from source: unknown 12081 1726882406.93090: variable 'ansible_module_compression' from source: unknown 12081 1726882406.93096: variable 'ansible_shell_type' from source: unknown 12081 1726882406.93102: variable 'ansible_shell_executable' from source: unknown 
12081 1726882406.93109: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882406.93116: variable 'ansible_pipelining' from source: unknown 12081 1726882406.93122: variable 'ansible_timeout' from source: unknown 12081 1726882406.93129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882406.93227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882406.93240: variable 'omit' from source: magic vars 12081 1726882406.93248: starting attempt loop 12081 1726882406.93259: running the handler 12081 1726882406.93271: _low_level_execute_command(): starting 12081 1726882406.93279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882406.93935: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.93954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.93972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.93992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.94036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.94049: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.94069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.94088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882406.94101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 
1726882406.94113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.94125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.94139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.94158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.94175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.94187: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.94201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.94284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.94306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.94323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.94459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.96041: stdout chunk (state=3): >>>/root <<< 12081 1726882406.96270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.96274: stdout chunk (state=3): >>><<< 12081 1726882406.96276: stderr chunk (state=3): >>><<< 12081 1726882406.96279: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.96281: _low_level_execute_command(): starting 12081 1726882406.96362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117 `" && echo ansible-tmp-1726882406.962485-13112-274207531267117="` echo /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117 `" ) && sleep 0' 12081 1726882406.96951: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882406.96972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.96987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.97005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.97050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.97072: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882406.97087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.97104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 12081 1726882406.97116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882406.97132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882406.97145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882406.97162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882406.97182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882406.97194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882406.97205: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882406.97219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882406.97301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882406.97323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882406.97343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882406.97576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882406.99349: stdout chunk (state=3): >>>ansible-tmp-1726882406.962485-13112-274207531267117=/root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117 <<< 12081 1726882406.99524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882406.99556: stderr chunk (state=3): >>><<< 12081 1726882406.99559: stdout chunk (state=3): >>><<< 12081 1726882406.99675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882406.962485-13112-274207531267117=/root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882406.99678: variable 'ansible_module_compression' from source: unknown 12081 1726882406.99681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882406.99683: variable 'ansible_facts' from source: unknown 12081 1726882406.99783: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/AnsiballZ_command.py 12081 1726882406.99969: Sending initial data 12081 1726882406.99972: Sent initial data (155 bytes) 12081 1726882407.01489: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.01505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.01520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.01543: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.01594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.01607: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.01622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.01640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.01663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.01679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.01693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.01723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.01736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.01747: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.01770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.01848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.01872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.01893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.02111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.03826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882407.03918: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882407.04026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpz118km34 /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/AnsiballZ_command.py <<< 12081 1726882407.04125: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882407.05956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.06047: stderr chunk (state=3): >>><<< 12081 1726882407.06050: stdout chunk (state=3): >>><<< 12081 1726882407.06077: done transferring module to remote 12081 1726882407.06086: _low_level_execute_command(): starting 12081 1726882407.06090: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/ /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/AnsiballZ_command.py && sleep 0' 12081 1726882407.07038: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.07048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.07062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.07079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.07117: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.07123: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.07132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.07155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.07188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.07195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.07203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.07480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.07493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.07504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.07507: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.07516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.07592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.07612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.07621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.07747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.09561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.09567: stdout chunk (state=3): >>><<< 12081 1726882407.09573: stderr chunk (state=3): >>><<< 12081 1726882407.09596: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.09599: _low_level_execute_command(): starting 12081 1726882407.09603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/AnsiballZ_command.py && sleep 0' 12081 1726882407.10503: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.10514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.10532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.10546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.10588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.10596: 
stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.10605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.10620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.10636: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.10643: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.10651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.10669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.10681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.10689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.10695: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.10705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.10801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.10818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.10830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.10984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.24320: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 21:33:27.238919", "end": "2024-09-20 21:33:27.241790", "delta": "0:00:00.002871", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882407.25486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882407.25490: stdout chunk (state=3): >>><<< 12081 1726882407.25496: stderr chunk (state=3): >>><<< 12081 1726882407.25521: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 21:33:27.238919", "end": "2024-09-20 21:33:27.241790", "delta": "0:00:00.002871", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882407.25550: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882407.25557: _low_level_execute_command(): starting 12081 1726882407.25564: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882406.962485-13112-274207531267117/ > /dev/null 2>&1 && sleep 0' 12081 1726882407.26306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.26319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.26330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.26344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.26388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.26397: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.26412: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.26430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.26438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.26445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.26453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.26467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.26482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.26490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.26497: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.26507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.26589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.26607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.26621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.26762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.28629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.28632: stdout chunk (state=3): >>><<< 12081 1726882407.28635: stderr chunk (state=3): >>><<< 12081 1726882407.28638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.28645: handler run complete 12081 1726882407.29168: Evaluated conditional (False): False 12081 1726882407.29172: variable 'bond_opt' from source: unknown 12081 1726882407.29174: variable 'result' from source: unknown 12081 1726882407.29176: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882407.29178: attempt loop complete, returning result 12081 1726882407.29180: variable 'bond_opt' from source: unknown 12081 1726882407.29182: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.002871", "end": "2024-09-20 21:33:27.241790", "rc": 0, "start": "2024-09-20 21:33:27.238919" } STDOUT: 0 12081 1726882407.29289: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.29292: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882407.29295: variable 'omit' from source: magic vars 12081 1726882407.29297: variable 'ansible_distribution_major_version' from source: facts 12081 1726882407.29299: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882407.29301: variable 'omit' from source: magic vars 12081 1726882407.29303: variable 'omit' from source: magic vars 12081 1726882407.29676: variable 'controller_device' from source: play vars 12081 1726882407.29679: variable 'bond_opt' from source: unknown 12081 1726882407.29696: variable 'omit' from source: magic vars 12081 1726882407.29716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882407.29812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882407.29820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882407.29889: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882407.29893: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.29895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882407.30006: Set connection var ansible_pipelining to False 12081 1726882407.30009: Set connection var ansible_shell_type to sh 12081 1726882407.30016: Set connection var ansible_shell_executable to /bin/sh 12081 1726882407.30018: Set connection var ansible_connection to ssh 12081 1726882407.30023: Set connection var ansible_timeout to 10 12081 1726882407.30028: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882407.30057: variable 'ansible_shell_executable' from source: unknown 12081 1726882407.30065: variable 'ansible_connection' from source: unknown 12081 
1726882407.30068: variable 'ansible_module_compression' from source: unknown 12081 1726882407.30071: variable 'ansible_shell_type' from source: unknown 12081 1726882407.30073: variable 'ansible_shell_executable' from source: unknown 12081 1726882407.30075: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.30081: variable 'ansible_pipelining' from source: unknown 12081 1726882407.30083: variable 'ansible_timeout' from source: unknown 12081 1726882407.30085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882407.30206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882407.30214: variable 'omit' from source: magic vars 12081 1726882407.30217: starting attempt loop 12081 1726882407.30220: running the handler 12081 1726882407.30225: _low_level_execute_command(): starting 12081 1726882407.30229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882407.30895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.30905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.30917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.30932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.30975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.30983: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.30992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 12081 1726882407.31005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.31013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.31019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.31028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.31036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.31049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.31060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.31069: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.31079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.31148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.31172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.31184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.31307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.32869: stdout chunk (state=3): >>>/root <<< 12081 1726882407.33026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.33030: stdout chunk (state=3): >>><<< 12081 1726882407.33036: stderr chunk (state=3): >>><<< 12081 1726882407.33051: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.33060: _low_level_execute_command(): starting 12081 1726882407.33068: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533 `" && echo ansible-tmp-1726882407.3305051-13112-261673982669533="` echo /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533 `" ) && sleep 0' 12081 1726882407.34234: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.34238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.34285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882407.34292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.34306: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882407.34311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.34324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882407.34329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.34411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.34415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.34428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.34707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.36409: stdout chunk (state=3): >>>ansible-tmp-1726882407.3305051-13112-261673982669533=/root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533 <<< 12081 1726882407.36580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.36584: stderr chunk (state=3): >>><<< 12081 1726882407.36587: stdout chunk (state=3): >>><<< 12081 1726882407.36606: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882407.3305051-13112-261673982669533=/root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.36628: variable 'ansible_module_compression' from source: unknown 12081 1726882407.36676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882407.36694: variable 'ansible_facts' from source: unknown 12081 1726882407.36761: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/AnsiballZ_command.py 12081 1726882407.37515: Sending initial data 12081 1726882407.37519: Sent initial data (156 bytes) 12081 1726882407.39103: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.39107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.40008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.40014: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.40029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882407.40034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.40142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.40158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.40237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.41965: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882407.42058: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882407.42166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpxud_iyp9 /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/AnsiballZ_command.py <<< 12081 
1726882407.42269: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882407.43904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.43993: stderr chunk (state=3): >>><<< 12081 1726882407.43997: stdout chunk (state=3): >>><<< 12081 1726882407.44016: done transferring module to remote 12081 1726882407.44024: _low_level_execute_command(): starting 12081 1726882407.44029: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/ /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/AnsiballZ_command.py && sleep 0' 12081 1726882407.45943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.45949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.45991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882407.45994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882407.46007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.46013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.46018: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.46032: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.46104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.46118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.46123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.46257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.48050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.48057: stderr chunk (state=3): >>><<< 12081 1726882407.48059: stdout chunk (state=3): >>><<< 12081 1726882407.48078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.48081: _low_level_execute_command(): starting 12081 1726882407.48086: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/AnsiballZ_command.py && sleep 0' 12081 1726882407.49298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.49306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.49317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.49331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.49482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.49489: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.49499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.49513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.49520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.49527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.49534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.49544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.49557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.49563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.49573: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.49592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.49667: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882407.49706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.49713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.49930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.63123: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 21:33:27.627008", "end": "2024-09-20 21:33:27.629910", "delta": "0:00:00.002902", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882407.64286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882407.64338: stderr chunk (state=3): >>><<< 12081 1726882407.64342: stdout chunk (state=3): >>><<< 12081 1726882407.64360: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 21:33:27.627008", "end": "2024-09-20 21:33:27.629910", "delta": "0:00:00.002902", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882407.64393: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882407.64400: _low_level_execute_command(): starting 12081 1726882407.64405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882407.3305051-13112-261673982669533/ > /dev/null 2>&1 && sleep 0' 12081 1726882407.65061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.65074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.65084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.65098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.65140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.65147: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.65158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.65174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.65181: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.65188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.65196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.65205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.65216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.65223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.65233: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.65244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.65316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.65334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.65348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.65479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.67483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.67515: stderr chunk (state=3): >>><<< 12081 1726882407.67518: stdout chunk (state=3): >>><<< 12081 1726882407.67568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.67576: handler run complete 12081 1726882407.67578: Evaluated conditional (False): False 12081 1726882407.67870: variable 'bond_opt' from source: unknown 12081 1726882407.67873: variable 'result' from source: unknown 12081 1726882407.67876: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882407.67878: attempt loop complete, returning result 12081 1726882407.67880: variable 'bond_opt' from source: unknown 12081 1726882407.67882: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.002902", "end": "2024-09-20 21:33:27.629910", "rc": 0, "start": "2024-09-20 21:33:27.627008" } STDOUT: slow 0 12081 1726882407.68085: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.68099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882407.68119: variable 'omit' from source: magic vars 12081 1726882407.68288: variable 'ansible_distribution_major_version' from source: facts 12081 1726882407.68299: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 12081 1726882407.68307: variable 'omit' from source: magic vars 12081 1726882407.68325: variable 'omit' from source: magic vars 12081 1726882407.68513: variable 'controller_device' from source: play vars 12081 1726882407.68523: variable 'bond_opt' from source: unknown 12081 1726882407.68547: variable 'omit' from source: magic vars 12081 1726882407.68581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882407.68595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882407.68606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882407.68623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882407.68630: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.68637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882407.68732: Set connection var ansible_pipelining to False 12081 1726882407.68740: Set connection var ansible_shell_type to sh 12081 1726882407.68755: Set connection var ansible_shell_executable to /bin/sh 12081 1726882407.68762: Set connection var ansible_connection to ssh 12081 1726882407.68778: Set connection var ansible_timeout to 10 12081 1726882407.68789: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882407.68815: variable 'ansible_shell_executable' from source: unknown 12081 1726882407.68823: variable 'ansible_connection' from source: unknown 12081 1726882407.68830: variable 'ansible_module_compression' from source: unknown 12081 1726882407.68836: variable 'ansible_shell_type' from source: unknown 12081 1726882407.68842: variable 'ansible_shell_executable' from source: unknown 
12081 1726882407.68849: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882407.68859: variable 'ansible_pipelining' from source: unknown 12081 1726882407.68869: variable 'ansible_timeout' from source: unknown 12081 1726882407.68884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882407.68980: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882407.68999: variable 'omit' from source: magic vars 12081 1726882407.69008: starting attempt loop 12081 1726882407.69014: running the handler 12081 1726882407.69025: _low_level_execute_command(): starting 12081 1726882407.69032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882407.69733: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.69756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.69775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.69794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.69836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.69849: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.69876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.69895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.69908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 
1726882407.69919: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.69933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.69948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.69971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.69989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.70001: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.70015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.70103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.70126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.70143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.70278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.71840: stdout chunk (state=3): >>>/root <<< 12081 1726882407.71945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.72029: stderr chunk (state=3): >>><<< 12081 1726882407.72033: stdout chunk (state=3): >>><<< 12081 1726882407.72050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.72059: _low_level_execute_command(): starting 12081 1726882407.72071: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933 `" && echo ansible-tmp-1726882407.7204993-13112-76184532493933="` echo /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933 `" ) && sleep 0' 12081 1726882407.72678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.72686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.72697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.72710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.72755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.72758: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.72766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.72780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 12081 1726882407.72788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.72794: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.72801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.72810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.72821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.72829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.72835: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.72844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.72921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.72940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.72955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.73095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.74943: stdout chunk (state=3): >>>ansible-tmp-1726882407.7204993-13112-76184532493933=/root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933 <<< 12081 1726882407.75058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.75144: stderr chunk (state=3): >>><<< 12081 1726882407.75148: stdout chunk (state=3): >>><<< 12081 1726882407.75166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882407.7204993-13112-76184532493933=/root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.75188: variable 'ansible_module_compression' from source: unknown 12081 1726882407.75228: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882407.75246: variable 'ansible_facts' from source: unknown 12081 1726882407.75314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/AnsiballZ_command.py 12081 1726882407.75441: Sending initial data 12081 1726882407.75444: Sent initial data (155 bytes) 12081 1726882407.76477: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882407.76481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.76484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.76486: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.76488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.76490: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882407.76493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.76495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882407.76501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882407.76612: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882407.76615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.76617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.76620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.76622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882407.76624: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882407.76625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.76627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882407.76640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.76655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.76784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.78518: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882407.78617: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882407.78719: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp_s4p6lwj /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/AnsiballZ_command.py <<< 12081 1726882407.78812: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882407.80189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.80377: stderr chunk (state=3): >>><<< 12081 1726882407.80381: stdout chunk (state=3): >>><<< 12081 1726882407.80383: done transferring module to remote 12081 1726882407.80385: _low_level_execute_command(): starting 12081 1726882407.80389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/ /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/AnsiballZ_command.py && sleep 0' 12081 1726882407.81527: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.81531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.81557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.81560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.81562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.81625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.82080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.82295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.84048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882407.84055: stdout chunk (state=3): >>><<< 12081 1726882407.84058: stderr chunk (state=3): >>><<< 12081 1726882407.84151: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882407.84157: _low_level_execute_command(): starting 12081 1726882407.84160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/AnsiballZ_command.py && sleep 0' 12081 1726882407.85534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882407.85538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882407.85589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882407.85594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882407.85596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882407.85669: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 12081 1726882407.85672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882407.85679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882407.85800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882407.99213: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 21:33:27.987571", "end": "2024-09-20 21:33:27.990649", "delta": "0:00:00.003078", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882408.00389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882408.00418: stderr chunk (state=3): >>><<< 12081 1726882408.00421: stdout chunk (state=3): >>><<< 12081 1726882408.00546: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 21:33:27.987571", "end": "2024-09-20 21:33:27.990649", "delta": "0:00:00.003078", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882408.00550: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882408.00557: _low_level_execute_command(): starting 12081 1726882408.00559: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882407.7204993-13112-76184532493933/ > /dev/null 2>&1 && sleep 0' 12081 1726882408.01948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.02082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.02099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.02118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.02160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.02176: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.02192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.02210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.02222: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.02234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.02246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.02260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.02384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.02398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.02411: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.02426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.02504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.02528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.02545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.02676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.04594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.04597: stdout chunk (state=3): >>><<< 12081 1726882408.04600: stderr chunk (state=3): >>><<< 12081 1726882408.05069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.05074: handler run complete 12081 1726882408.05077: Evaluated conditional (False): False 12081 1726882408.05080: variable 'bond_opt' from source: unknown 12081 1726882408.05082: variable 'result' from source: unknown 12081 1726882408.05085: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882408.05087: attempt loop complete, returning result 12081 1726882408.05090: variable 'bond_opt' from source: unknown 12081 1726882408.05092: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003078", "end": "2024-09-20 21:33:27.990649", "rc": 0, "start": "2024-09-20 21:33:27.987571" } STDOUT: 128 12081 1726882408.05197: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.05200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.05203: variable 'omit' from source: magic vars 12081 1726882408.05310: variable 'ansible_distribution_major_version' from source: facts 12081 1726882408.05321: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 12081 1726882408.05328: variable 'omit' from source: magic vars 12081 1726882408.05345: variable 'omit' from source: magic vars 12081 1726882408.05608: variable 'controller_device' from source: play vars 12081 1726882408.05977: variable 'bond_opt' from source: unknown 12081 1726882408.06003: variable 'omit' from source: magic vars 12081 1726882408.06027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882408.06038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.06048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.06066: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882408.06073: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.06079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.06152: Set connection var ansible_pipelining to False 12081 1726882408.06160: Set connection var ansible_shell_type to sh 12081 1726882408.06177: Set connection var ansible_shell_executable to /bin/sh 12081 1726882408.06184: Set connection var ansible_connection to ssh 12081 1726882408.06193: Set connection var ansible_timeout to 10 12081 1726882408.06202: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882408.06229: variable 'ansible_shell_executable' from source: unknown 12081 1726882408.06236: variable 'ansible_connection' from source: unknown 12081 1726882408.06242: variable 'ansible_module_compression' from source: unknown 12081 1726882408.06248: variable 'ansible_shell_type' from source: unknown 12081 1726882408.06253: variable 'ansible_shell_executable' from source: unknown 
12081 1726882408.06259: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.06269: variable 'ansible_pipelining' from source: unknown 12081 1726882408.06276: variable 'ansible_timeout' from source: unknown 12081 1726882408.06283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.06378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882408.06392: variable 'omit' from source: magic vars 12081 1726882408.06576: starting attempt loop 12081 1726882408.06584: running the handler 12081 1726882408.06594: _low_level_execute_command(): starting 12081 1726882408.06607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882408.08458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.08462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.08555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.08559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.08562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.08741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.08798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.08801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.08933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.10488: stdout chunk (state=3): >>>/root <<< 12081 1726882408.10699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.10703: stdout chunk (state=3): >>><<< 12081 1726882408.10706: stderr chunk (state=3): >>><<< 12081 1726882408.10806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 12081 1726882408.10810: _low_level_execute_command(): starting 12081 1726882408.10812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160 `" && echo ansible-tmp-1726882408.1072283-13112-230466126366160="` echo /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160 `" ) && sleep 0' 12081 1726882408.11457: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.11492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.11516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.11536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.11889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882408.11892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.11895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.11960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.11974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882408.12096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.13957: stdout chunk (state=3): >>>ansible-tmp-1726882408.1072283-13112-230466126366160=/root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160 <<< 12081 1726882408.14136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.14221: stderr chunk (state=3): >>><<< 12081 1726882408.14233: stdout chunk (state=3): >>><<< 12081 1726882408.14476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882408.1072283-13112-230466126366160=/root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.14480: variable 'ansible_module_compression' from source: unknown 12081 1726882408.14483: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882408.14485: variable 'ansible_facts' from source: unknown 12081 1726882408.14487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/AnsiballZ_command.py 12081 1726882408.14547: Sending initial data 12081 1726882408.14550: Sent initial data (156 bytes) 12081 1726882408.15478: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.15490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.15502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.15518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.15558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.15577: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.15591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.15608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.15619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.15631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.15641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.15652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.15671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.15681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 12081 1726882408.15689: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.15700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.15775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.15806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.15831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.15969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.17714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882408.17812: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882408.17975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp_n5xly7c /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/AnsiballZ_command.py <<< 12081 1726882408.18078: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882408.19436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.19639: stderr chunk (state=3): >>><<< 12081 1726882408.19642: stdout chunk (state=3): >>><<< 12081 
1726882408.19645: done transferring module to remote 12081 1726882408.19647: _low_level_execute_command(): starting 12081 1726882408.19649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/ /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/AnsiballZ_command.py && sleep 0' 12081 1726882408.20280: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.20294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.20311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.20337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.20383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.20396: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.20411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.20438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.20454: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.20469: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.20482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.20496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.20513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.20525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.20538: stderr chunk 
(state=3): >>>debug2: match found <<< 12081 1726882408.20562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.20639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.20675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.20693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.20824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.22570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.22635: stderr chunk (state=3): >>><<< 12081 1726882408.22638: stdout chunk (state=3): >>><<< 12081 1726882408.22722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
12081 1726882408.22725: _low_level_execute_command(): starting 12081 1726882408.22728: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/AnsiballZ_command.py && sleep 0' 12081 1726882408.23331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.23347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.23367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.23388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.23430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.23444: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.23462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.23483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.23501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.23513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.23524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.23538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.23559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.23574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.23587: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.23601: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.23680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.23703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.23719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.23860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.37267: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 21:33:28.368445", "end": "2024-09-20 21:33:28.371222", "delta": "0:00:00.002777", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882408.38387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882408.38459: stderr chunk (state=3): >>><<< 12081 1726882408.38463: stdout chunk (state=3): >>><<< 12081 1726882408.38488: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 21:33:28.368445", "end": "2024-09-20 21:33:28.371222", "delta": "0:00:00.002777", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882408.38519: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882408.38524: _low_level_execute_command(): starting 12081 1726882408.38530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882408.1072283-13112-230466126366160/ > /dev/null 2>&1 && sleep 0' 12081 1726882408.39180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.39189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.39198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.39211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.39253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.39266: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.39280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.39292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.39299: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.39305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.39313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.39321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.39331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.39338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.39344: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.39356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.39436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.39456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.39468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.39609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.41467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.41473: stdout chunk (state=3): >>><<< 12081 1726882408.41483: stderr chunk (state=3): >>><<< 12081 1726882408.41501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.41507: handler run complete 12081 1726882408.41525: Evaluated conditional (False): False 12081 1726882408.41683: variable 'bond_opt' from source: unknown 12081 1726882408.41687: variable 'result' from source: unknown 12081 1726882408.41702: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882408.41713: attempt loop complete, returning result 12081 1726882408.41732: variable 'bond_opt' from source: unknown 12081 1726882408.41809: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.002777", "end": "2024-09-20 21:33:28.371222", "rc": 0, "start": "2024-09-20 21:33:28.368445" } STDOUT: 110 12081 1726882408.41948: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.41954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.41958: variable 'omit' from source: magic vars 12081 1726882408.42102: variable 'ansible_distribution_major_version' from source: facts 12081 1726882408.42111: Evaluated conditional (ansible_distribution_major_version 
!= '6'): True 12081 1726882408.42114: variable 'omit' from source: magic vars 12081 1726882408.42130: variable 'omit' from source: magic vars 12081 1726882408.42302: variable 'controller_device' from source: play vars 12081 1726882408.42305: variable 'bond_opt' from source: unknown 12081 1726882408.42330: variable 'omit' from source: magic vars 12081 1726882408.42350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882408.42357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.42365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.42380: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882408.42383: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.42385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.42473: Set connection var ansible_pipelining to False 12081 1726882408.42477: Set connection var ansible_shell_type to sh 12081 1726882408.42483: Set connection var ansible_shell_executable to /bin/sh 12081 1726882408.42486: Set connection var ansible_connection to ssh 12081 1726882408.42491: Set connection var ansible_timeout to 10 12081 1726882408.42496: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882408.42523: variable 'ansible_shell_executable' from source: unknown 12081 1726882408.42526: variable 'ansible_connection' from source: unknown 12081 1726882408.42529: variable 'ansible_module_compression' from source: unknown 12081 1726882408.42531: variable 'ansible_shell_type' from source: unknown 12081 1726882408.42533: variable 'ansible_shell_executable' from source: unknown 12081 1726882408.42535: variable 
'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.42544: variable 'ansible_pipelining' from source: unknown 12081 1726882408.42547: variable 'ansible_timeout' from source: unknown 12081 1726882408.42549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.42656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882408.42665: variable 'omit' from source: magic vars 12081 1726882408.42670: starting attempt loop 12081 1726882408.42672: running the handler 12081 1726882408.42679: _low_level_execute_command(): starting 12081 1726882408.42682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882408.43378: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.43393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.43405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.43421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.43460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.43470: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.43481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.43496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.43505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.43512: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.43528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.43537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.43548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.43556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.43562: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.43575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.43650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.43671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.43684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.43811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.45378: stdout chunk (state=3): >>>/root <<< 12081 1726882408.45480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.45548: stderr chunk (state=3): >>><<< 12081 1726882408.45553: stdout chunk (state=3): >>><<< 12081 1726882408.45571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.45580: _low_level_execute_command(): starting 12081 1726882408.45585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470 `" && echo ansible-tmp-1726882408.45571-13112-118292124469470="` echo /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470 `" ) && sleep 0' 12081 1726882408.46206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.46214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.46224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.46237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.46277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.46285: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.46292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.46305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.46312: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.46318: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.46326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.46335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.46346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.46355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.46358: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.46373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.46442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.46459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.46474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.46599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.48479: stdout chunk (state=3): >>>ansible-tmp-1726882408.45571-13112-118292124469470=/root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470 <<< 12081 1726882408.48582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.48647: stderr chunk (state=3): >>><<< 12081 1726882408.48651: stdout chunk (state=3): >>><<< 12081 1726882408.48671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882408.45571-13112-118292124469470=/root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.48693: variable 'ansible_module_compression' from source: unknown 12081 1726882408.48729: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882408.48745: variable 'ansible_facts' from source: unknown 12081 1726882408.48820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/AnsiballZ_command.py 12081 1726882408.48939: Sending initial data 12081 1726882408.48943: Sent initial data (154 bytes) 12081 1726882408.49872: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.49881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.49892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.49908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882408.49943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.49951: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.49961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.49977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.49985: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.49991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.49999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.50010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.50021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.50026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.50033: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.50042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.50113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.50133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.50144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.50272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.52020: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882408.52110: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882408.52214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpjes6n0_q /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/AnsiballZ_command.py <<< 12081 1726882408.52312: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882408.53798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.53802: stderr chunk (state=3): >>><<< 12081 1726882408.53804: stdout chunk (state=3): >>><<< 12081 1726882408.53827: done transferring module to remote 12081 1726882408.53834: _low_level_execute_command(): starting 12081 1726882408.53844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/ /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/AnsiballZ_command.py && sleep 0' 12081 1726882408.54456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.54463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.54477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.54492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.54527: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.54534: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.54544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.54557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.54567: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.54576: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.54585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.54591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.54603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.54610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.54616: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.54626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.54695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.54713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.54725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.54853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.56693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.56697: stdout chunk (state=3): >>><<< 12081 1726882408.56699: stderr chunk (state=3): >>><<< 12081 1726882408.56770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.56774: _low_level_execute_command(): starting 12081 1726882408.56777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/AnsiballZ_command.py && sleep 0' 12081 1726882408.57431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.57445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.57471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.57492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.57534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.57547: stderr chunk (state=3): >>>debug2: match not 
found <<< 12081 1726882408.57568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.57593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.57606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.57619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.57631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.57645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.57668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.57682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.57700: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.57715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.57797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.57823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.57841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.57987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.71248: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 21:33:28.708148", "end": "2024-09-20 21:33:28.711086", "delta": "0:00:00.002938", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882408.72385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882408.72418: stderr chunk (state=3): >>><<< 12081 1726882408.72421: stdout chunk (state=3): >>><<< 12081 1726882408.72438: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 21:33:28.708148", "end": "2024-09-20 21:33:28.711086", "delta": "0:00:00.002938", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882408.72468: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882408.72473: _low_level_execute_command(): starting 12081 1726882408.72478: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882408.45571-13112-118292124469470/ > /dev/null 2>&1 && sleep 0' 12081 1726882408.73129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.73137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.73148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.73162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.73203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.73218: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.73228: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.73242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.73249: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.73256: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.73265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.73276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.73288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.73296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.73302: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.73311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.73391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.73410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.73424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.73562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.75431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.75435: stdout chunk (state=3): >>><<< 12081 1726882408.75441: stderr chunk (state=3): >>><<< 12081 1726882408.75460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12081 1726882408.75463: handler run complete
12081 1726882408.75487: Evaluated conditional (False): False
12081 1726882408.75639: variable 'bond_opt' from source: unknown
12081 1726882408.75645: variable 'result' from source: unknown
12081 1726882408.75660: Evaluated conditional (bond_opt.value in result.stdout): True
12081 1726882408.75673: attempt loop complete, returning result
12081 1726882408.75692: variable 'bond_opt' from source: unknown
12081 1726882408.75757: variable 'bond_opt' from source: unknown
ok: [managed_node3] => (item={'key': 'num_grat_arp', 'value': '64'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "num_grat_arp",
        "value": "64"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/num_grat_arp"
    ],
    "delta": "0:00:00.002938",
    "end": "2024-09-20 21:33:28.711086",
    "rc": 0,
    "start": "2024-09-20 21:33:28.708148"
}

STDOUT:

64
12081 1726882408.75900: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882408.75903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081
1726882408.75906: variable 'omit' from source: magic vars 12081 1726882408.76056: variable 'ansible_distribution_major_version' from source: facts 12081 1726882408.76060: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882408.76062: variable 'omit' from source: magic vars 12081 1726882408.76080: variable 'omit' from source: magic vars 12081 1726882408.76235: variable 'controller_device' from source: play vars 12081 1726882408.76238: variable 'bond_opt' from source: unknown 12081 1726882408.76257: variable 'omit' from source: magic vars 12081 1726882408.76279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882408.76286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.76293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882408.76306: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882408.76309: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.76312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.76388: Set connection var ansible_pipelining to False 12081 1726882408.76392: Set connection var ansible_shell_type to sh 12081 1726882408.76399: Set connection var ansible_shell_executable to /bin/sh 12081 1726882408.76402: Set connection var ansible_connection to ssh 12081 1726882408.76406: Set connection var ansible_timeout to 10 12081 1726882408.76411: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882408.76434: variable 'ansible_shell_executable' from source: unknown 12081 1726882408.76437: variable 'ansible_connection' from source: unknown 12081 1726882408.76439: variable 
'ansible_module_compression' from source: unknown 12081 1726882408.76442: variable 'ansible_shell_type' from source: unknown 12081 1726882408.76444: variable 'ansible_shell_executable' from source: unknown 12081 1726882408.76446: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882408.76455: variable 'ansible_pipelining' from source: unknown 12081 1726882408.76459: variable 'ansible_timeout' from source: unknown 12081 1726882408.76461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882408.76545: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882408.76555: variable 'omit' from source: magic vars 12081 1726882408.76558: starting attempt loop 12081 1726882408.76561: running the handler 12081 1726882408.76567: _low_level_execute_command(): starting 12081 1726882408.76569: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882408.77190: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.77199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.77208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.77221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.77257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.77266: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.77276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882408.77288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.77295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.77301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.77308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.77316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.77326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.77332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.77339: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.77348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.77419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.77436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.77447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.77576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.79126: stdout chunk (state=3): >>>/root <<< 12081 1726882408.79320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.79323: stdout chunk (state=3): >>><<< 12081 1726882408.79325: stderr chunk (state=3): >>><<< 12081 1726882408.79414: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.79418: _low_level_execute_command(): starting 12081 1726882408.79420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150 `" && echo ansible-tmp-1726882408.7934046-13112-114563319121150="` echo /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150 `" ) && sleep 0' 12081 1726882408.80032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.80048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.80075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.80094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.80136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.80149: stderr chunk (state=3): >>>debug2: match not found <<< 12081 
1726882408.80170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.80192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.80205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.80217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.80229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.80243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.80264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.80280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.80295: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.80310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.80389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.80416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.80433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.80573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882408.82427: stdout chunk (state=3): >>>ansible-tmp-1726882408.7934046-13112-114563319121150=/root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150 <<< 12081 1726882408.82589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.82659: stderr chunk (state=3): >>><<< 12081 1726882408.82663: stdout chunk (state=3): >>><<< 12081 1726882408.82771: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882408.7934046-13112-114563319121150=/root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.82779: variable 'ansible_module_compression' from source: unknown 12081 1726882408.82781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882408.82783: variable 'ansible_facts' from source: unknown 12081 1726882408.82879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/AnsiballZ_command.py 12081 1726882408.83259: Sending initial data 12081 1726882408.83262: Sent initial data (156 bytes) 12081 1726882408.84235: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.84244: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12081 1726882408.84258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.84272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.84316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.84323: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.84333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.84346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.84356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.84359: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.84370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.84378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.84391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.84400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.84407: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.84421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.84491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.84512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.84525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.84664: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12081 1726882408.86385: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882408.86475: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882408.86580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp2cdkhbqp /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/AnsiballZ_command.py <<< 12081 1726882408.86683: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882408.88403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.88484: stderr chunk (state=3): >>><<< 12081 1726882408.88487: stdout chunk (state=3): >>><<< 12081 1726882408.88508: done transferring module to remote 12081 1726882408.88516: _low_level_execute_command(): starting 12081 1726882408.88521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/ /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/AnsiballZ_command.py && sleep 0' 12081 1726882408.89514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.90409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 
1726882408.90419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.90434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.90479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.90487: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.90496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.90510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.90518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.90524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.90532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.90541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.90555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.90558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.90568: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.90577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.90647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.90681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.90694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.90822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882408.92625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882408.92629: stdout chunk (state=3): >>><<< 12081 1726882408.92634: stderr chunk (state=3): >>><<< 12081 1726882408.92658: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882408.92669: _low_level_execute_command(): starting 12081 1726882408.92671: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/AnsiballZ_command.py && sleep 0' 12081 1726882408.93393: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882408.93396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.93399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882408.93401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.93403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.93419: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882408.93973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.93976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882408.93979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882408.93981: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882408.93983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882408.93985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882408.93986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882408.93988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882408.93990: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882408.93992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882408.93994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882408.93996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882408.93998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882408.94000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.07073: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", 
"/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 21:33:29.066394", "end": "2024-09-20 21:33:29.069350", "delta": "0:00:00.002956", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882409.08304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882409.08308: stdout chunk (state=3): >>><<< 12081 1726882409.08311: stderr chunk (state=3): >>><<< 12081 1726882409.08437: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 21:33:29.066394", "end": "2024-09-20 21:33:29.069350", "delta": "0:00:00.002956", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882409.08441: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882409.08443: _low_level_execute_command(): starting 12081 1726882409.08445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882408.7934046-13112-114563319121150/ > /dev/null 2>&1 && sleep 0' 12081 1726882409.09025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.09038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.09051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.09075: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.09118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.09129: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.09143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.09162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.09178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.09192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.09207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.09221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.09236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.09247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.09258: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.09275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.09352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882409.09380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.09402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.09543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.11380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882409.11434: stderr chunk (state=3): >>><<< 12081 1726882409.11437: stdout chunk (state=3): >>><<< 
12081 1726882409.11471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882409.11474: handler run complete 12081 1726882409.11571: Evaluated conditional (False): False 12081 1726882409.11679: variable 'bond_opt' from source: unknown 12081 1726882409.11696: variable 'result' from source: unknown 12081 1726882409.11714: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882409.11730: attempt loop complete, returning result 12081 1726882409.11757: variable 'bond_opt' from source: unknown 12081 1726882409.11837: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": 
"0:00:00.002956", "end": "2024-09-20 21:33:29.069350", "rc": 0, "start": "2024-09-20 21:33:29.066394" } STDOUT: 225 12081 1726882409.12074: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882409.12089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882409.12111: variable 'omit' from source: magic vars 12081 1726882409.12279: variable 'ansible_distribution_major_version' from source: facts 12081 1726882409.12290: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882409.12299: variable 'omit' from source: magic vars 12081 1726882409.12320: variable 'omit' from source: magic vars 12081 1726882409.12783: variable 'controller_device' from source: play vars 12081 1726882409.12786: variable 'bond_opt' from source: unknown 12081 1726882409.12806: variable 'omit' from source: magic vars 12081 1726882409.12827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882409.12835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882409.12842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882409.12857: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882409.12860: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882409.12862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882409.12943: Set connection var ansible_pipelining to False 12081 1726882409.12946: Set connection var ansible_shell_type to sh 12081 1726882409.12955: Set connection var ansible_shell_executable to /bin/sh 12081 1726882409.12958: Set connection var ansible_connection to ssh 12081 1726882409.12960: Set connection 
var ansible_timeout to 10 12081 1726882409.12966: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882409.12989: variable 'ansible_shell_executable' from source: unknown 12081 1726882409.12992: variable 'ansible_connection' from source: unknown 12081 1726882409.12995: variable 'ansible_module_compression' from source: unknown 12081 1726882409.12997: variable 'ansible_shell_type' from source: unknown 12081 1726882409.13000: variable 'ansible_shell_executable' from source: unknown 12081 1726882409.13002: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882409.13004: variable 'ansible_pipelining' from source: unknown 12081 1726882409.13006: variable 'ansible_timeout' from source: unknown 12081 1726882409.13011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882409.13103: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882409.13111: variable 'omit' from source: magic vars 12081 1726882409.13114: starting attempt loop 12081 1726882409.13116: running the handler 12081 1726882409.13123: _low_level_execute_command(): starting 12081 1726882409.13126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882409.13747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.13758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.13767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.13781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.13820: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.13827: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.13837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.13851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.13858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.13866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.13874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.13883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.13895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.13900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.13908: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.13917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.13988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882409.14010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.14020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.14154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.15729: stdout chunk (state=3): >>>/root <<< 12081 1726882409.15902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882409.15906: stderr chunk (state=3): >>><<< 12081 1726882409.15908: stdout chunk (state=3): >>><<< 12081 1726882409.15926: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882409.15935: _low_level_execute_command(): starting 12081 1726882409.15941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291 `" && echo ansible-tmp-1726882409.1592643-13112-137927105084291="` echo /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291 `" ) && sleep 0' 12081 1726882409.16565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.16575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.16586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.16601: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.16639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.16647: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.16658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.16676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.16685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.16690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.16698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.16707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.16718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.16726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.16732: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.16741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.16823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882409.16842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.16857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.16998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.18856: stdout chunk (state=3): 
>>>ansible-tmp-1726882409.1592643-13112-137927105084291=/root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291 <<< 12081 1726882409.18969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882409.19072: stderr chunk (state=3): >>><<< 12081 1726882409.19084: stdout chunk (state=3): >>><<< 12081 1726882409.19341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882409.1592643-13112-137927105084291=/root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882409.19345: variable 'ansible_module_compression' from source: unknown 12081 1726882409.19348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882409.19350: variable 'ansible_facts' from source: unknown 12081 
1726882409.19355: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/AnsiballZ_command.py 12081 1726882409.19421: Sending initial data 12081 1726882409.19426: Sent initial data (156 bytes) 12081 1726882409.20487: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.20500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.20512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.20528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.20585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.20596: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.20607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.20621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.20630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.20638: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.20650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.20673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.20686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.20697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.20706: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.20717: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.20807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882409.20831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.20848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.20996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.22717: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882409.22818: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882409.22923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp3sitfk69 /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/AnsiballZ_command.py <<< 12081 1726882409.23018: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882409.24380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882409.24634: stderr chunk (state=3): >>><<< 12081 1726882409.24638: stdout chunk (state=3): >>><<< 12081 1726882409.24640: done transferring module to remote 12081 1726882409.24642: _low_level_execute_command(): starting 12081 1726882409.24645: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/ /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/AnsiballZ_command.py && sleep 0' 12081 1726882409.25285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.25308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.25326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.25345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.25396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.25415: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.25435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.25457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.25473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.25485: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.25497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.25515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.25539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.25555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.25571: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.25584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.25681: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882409.25704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.25721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.25920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882409.27672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882409.27676: stderr chunk (state=3): >>><<< 12081 1726882409.27679: stdout chunk (state=3): >>><<< 12081 1726882409.27699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882409.27703: _low_level_execute_command(): starting 12081 1726882409.27705: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/AnsiballZ_command.py && sleep 0' 12081 1726882409.28335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882409.28346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.28349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.28367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.28405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.28413: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882409.28423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.28436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882409.28444: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882409.28458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882409.28461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882409.28466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882409.28482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882409.28489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882409.28496: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882409.28505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882409.28577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 
1726882409.28596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882409.28608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882409.28743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.42172: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 21:33:29.416090", "end": "2024-09-20 21:33:30.420107", "delta": "0:00:01.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882410.43583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882410.43588: stdout chunk (state=3): >>><<< 12081 1726882410.43590: stderr chunk (state=3): >>><<< 12081 1726882410.43723: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 21:33:29.416090", "end": "2024-09-20 21:33:30.420107", "delta": "0:00:01.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882410.43727: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882410.43730: _low_level_execute_command(): starting 12081 1726882410.43732: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882409.1592643-13112-137927105084291/ > /dev/null 2>&1 && sleep 0' 12081 1726882410.44355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.44373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.44387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.44402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.44440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.44451: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.44467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.44484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.44494: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.44502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.44511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.44521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.44534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.44543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.44551: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.44567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.44660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.44686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.44702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.44833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.46689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.46783: stderr chunk (state=3): >>><<< 12081 1726882410.46786: stdout chunk (state=3): >>><<< 12081 1726882410.46972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.46976: handler run complete 12081 1726882410.46978: Evaluated conditional (False): False 12081 1726882410.47004: variable 'bond_opt' from source: unknown 12081 1726882410.47015: variable 'result' from source: unknown 12081 1726882410.47034: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882410.47055: attempt loop complete, returning result 12081 1726882410.47086: variable 'bond_opt' from source: unknown 12081 1726882410.47148: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:01.004017", "end": "2024-09-20 21:33:30.420107", "rc": 0, "start": "2024-09-20 21:33:29.416090" } STDOUT: 0 12081 1726882410.47377: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.47385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.47394: variable 'omit' from source: magic vars 12081 1726882410.47557: variable 'ansible_distribution_major_version' from source: facts 12081 1726882410.47561: Evaluated conditional (ansible_distribution_major_version != 
'6'): True 12081 1726882410.47564: variable 'omit' from source: magic vars 12081 1726882410.47581: variable 'omit' from source: magic vars 12081 1726882410.47790: variable 'controller_device' from source: play vars 12081 1726882410.47793: variable 'bond_opt' from source: unknown 12081 1726882410.47814: variable 'omit' from source: magic vars 12081 1726882410.47835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882410.47842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882410.47859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882410.47863: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882410.47871: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.47873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.48255: Set connection var ansible_pipelining to False 12081 1726882410.48258: Set connection var ansible_shell_type to sh 12081 1726882410.48266: Set connection var ansible_shell_executable to /bin/sh 12081 1726882410.48268: Set connection var ansible_connection to ssh 12081 1726882410.48274: Set connection var ansible_timeout to 10 12081 1726882410.48279: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882410.48308: variable 'ansible_shell_executable' from source: unknown 12081 1726882410.48311: variable 'ansible_connection' from source: unknown 12081 1726882410.48313: variable 'ansible_module_compression' from source: unknown 12081 1726882410.48316: variable 'ansible_shell_type' from source: unknown 12081 1726882410.48318: variable 'ansible_shell_executable' from source: unknown 12081 1726882410.48320: variable 
'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.48322: variable 'ansible_pipelining' from source: unknown 12081 1726882410.48326: variable 'ansible_timeout' from source: unknown 12081 1726882410.48330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.48432: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882410.48440: variable 'omit' from source: magic vars 12081 1726882410.48443: starting attempt loop 12081 1726882410.48446: running the handler 12081 1726882410.48455: _low_level_execute_command(): starting 12081 1726882410.48458: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882410.50027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.50044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.50056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.50073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.50111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.50118: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.50127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.50140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.50145: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.50154: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.50159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.50169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.50184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.50190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.50196: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.50205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.50278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.50296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.50305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.50433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.52058: stdout chunk (state=3): >>>/root <<< 12081 1726882410.52160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.52249: stderr chunk (state=3): >>><<< 12081 1726882410.52263: stdout chunk (state=3): >>><<< 12081 1726882410.52376: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.52380: _low_level_execute_command(): starting 12081 1726882410.52383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099 `" && echo ansible-tmp-1726882410.5228813-13112-128118546528099="` echo /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099 `" ) && sleep 0' 12081 1726882410.53028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.53045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.53070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.53090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.53136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.53156: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.53180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.53198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.53211: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.53222: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.53234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.53250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.53275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.53288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.53299: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.53313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.53397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.53421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.53437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.53582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.55482: stdout chunk (state=3): >>>ansible-tmp-1726882410.5228813-13112-128118546528099=/root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099 <<< 12081 1726882410.55593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.55693: stderr chunk (state=3): >>><<< 12081 1726882410.55705: stdout chunk (state=3): >>><<< 12081 1726882410.55874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882410.5228813-13112-128118546528099=/root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.55878: variable 'ansible_module_compression' from source: unknown 12081 1726882410.56271: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882410.56274: variable 'ansible_facts' from source: unknown 12081 1726882410.56376: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/AnsiballZ_command.py 12081 1726882410.57170: Sending initial data 12081 1726882410.57173: Sent initial data (156 bytes) 12081 1726882410.58599: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.58607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.58618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.58634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
12081 1726882410.58679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.58683: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.58694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.58707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.58714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.58721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.58728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.58737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.58749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.58756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.58761: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.58772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.58842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.58868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.58872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.59014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.60768: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882410.60858: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882410.60963: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpli_30vqb /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/AnsiballZ_command.py <<< 12081 1726882410.61070: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882410.62566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.62648: stderr chunk (state=3): >>><<< 12081 1726882410.62654: stdout chunk (state=3): >>><<< 12081 1726882410.62680: done transferring module to remote 12081 1726882410.62688: _low_level_execute_command(): starting 12081 1726882410.62693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/ /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/AnsiballZ_command.py && sleep 0' 12081 1726882410.63358: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.63366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.63382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.63396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.63436: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.63444: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.63459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.63477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.63484: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.63490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.63498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.63507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.63518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.63527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.63532: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.63541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.63617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.63637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.63648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.63781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.65587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.65658: stderr chunk (state=3): >>><<< 12081 1726882410.65662: stdout chunk (state=3): >>><<< 12081 1726882410.65759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.65763: _low_level_execute_command(): starting 12081 1726882410.65767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/AnsiballZ_command.py && sleep 0' 12081 1726882410.66384: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.66398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.66414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.66437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.66486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.66499: stderr chunk (state=3): >>>debug2: match 
not found <<< 12081 1726882410.66512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.66530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.66545: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.66560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.66576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.66590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.66606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.66618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.66628: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.66645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.66725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.66748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.66773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.67140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.80328: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 21:33:30.798958", "end": "2024-09-20 21:33:30.801859", "delta": "0:00:00.002901", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882410.81586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882410.81590: stdout chunk (state=3): >>><<< 12081 1726882410.81592: stderr chunk (state=3): >>><<< 12081 1726882410.81709: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 21:33:30.798958", "end": "2024-09-20 21:33:30.801859", "delta": "0:00:00.002901", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882410.81717: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882410.81719: _low_level_execute_command(): starting 12081 1726882410.81721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882410.5228813-13112-128118546528099/ > /dev/null 2>&1 && sleep 0' 12081 1726882410.82345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.82349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.82388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882410.82392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.82394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882410.82396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.82457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.82471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.82591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.84420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.84504: stderr chunk (state=3): >>><<< 12081 1726882410.84509: stdout chunk (state=3): >>><<< 12081 1726882410.84529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.84534: handler run complete 12081 1726882410.84556: Evaluated conditional (False): False 12081 1726882410.84697: variable 'bond_opt' from source: unknown 12081 1726882410.84702: variable 'result' from source: unknown 12081 1726882410.84715: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882410.84725: attempt loop complete, returning result 12081 1726882410.84742: variable 'bond_opt' from source: unknown 12081 1726882410.84806: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.002901", "end": "2024-09-20 21:33:30.801859", "rc": 0, "start": "2024-09-20 21:33:30.798958" } STDOUT: 1 12081 1726882410.84949: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.84956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.84959: variable 'omit' from source: magic vars 12081 1726882410.85079: variable 'ansible_distribution_major_version' from source: facts 12081 1726882410.85084: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882410.85088: variable 'omit' from source: magic vars 12081 1726882410.85102: variable 'omit' from source: magic vars 12081 1726882410.85261: variable 'controller_device' from source: play vars 12081 1726882410.85266: variable 'bond_opt' from source: unknown 12081 1726882410.85285: variable 'omit' from source: magic vars 12081 1726882410.85311: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882410.85319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882410.85325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882410.85336: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882410.85339: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.85341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.85419: Set connection var ansible_pipelining to False 12081 1726882410.85422: Set connection var ansible_shell_type to sh 12081 1726882410.85428: Set connection var ansible_shell_executable to /bin/sh 12081 1726882410.85430: Set connection var ansible_connection to ssh 12081 1726882410.85435: Set connection var ansible_timeout to 10 12081 1726882410.85440: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882410.85458: variable 'ansible_shell_executable' from source: unknown 12081 1726882410.85462: variable 'ansible_connection' from source: unknown 12081 1726882410.85466: variable 'ansible_module_compression' from source: unknown 12081 1726882410.85468: variable 'ansible_shell_type' from source: unknown 12081 1726882410.85470: variable 'ansible_shell_executable' from source: unknown 12081 1726882410.85472: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882410.85477: variable 'ansible_pipelining' from source: unknown 12081 1726882410.85479: variable 'ansible_timeout' from source: unknown 12081 1726882410.85483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882410.85573: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882410.85581: variable 'omit' from source: magic vars 12081 1726882410.85584: starting attempt loop 12081 1726882410.85586: running the handler 12081 1726882410.85592: _low_level_execute_command(): starting 12081 1726882410.85595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882410.86262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.86273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.86290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.86304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.86343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.86349: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.86359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.86376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.86385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.86398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.86406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.86418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.86425: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.86433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.86440: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.86449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.86526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.86545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.86557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.86697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.88312: stdout chunk (state=3): >>>/root <<< 12081 1726882410.88478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.88481: stdout chunk (state=3): >>><<< 12081 1726882410.88487: stderr chunk (state=3): >>><<< 12081 1726882410.88505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.88513: _low_level_execute_command(): starting 12081 1726882410.88519: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618 `" && echo ansible-tmp-1726882410.8850443-13112-265877549184618="` echo /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618 `" ) && sleep 0' 12081 1726882410.89145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.89157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.89167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.89182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.89218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.89226: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.89236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.89249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.89257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.89265: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.89276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.89285: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882410.89297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.89305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.89311: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.89322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.89393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.89411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.89424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.89554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.91451: stdout chunk (state=3): >>>ansible-tmp-1726882410.8850443-13112-265877549184618=/root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618 <<< 12081 1726882410.91654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.91658: stdout chunk (state=3): >>><<< 12081 1726882410.91661: stderr chunk (state=3): >>><<< 12081 1726882410.91916: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882410.8850443-13112-265877549184618=/root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882410.91924: variable 'ansible_module_compression' from source: unknown 12081 1726882410.91926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882410.91929: variable 'ansible_facts' from source: unknown 12081 1726882410.91931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/AnsiballZ_command.py 12081 1726882410.91998: Sending initial data 12081 1726882410.92002: Sent initial data (156 bytes) 12081 1726882410.93057: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.93081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.93096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.93120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.93167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.93181: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.93196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882410.93214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882410.93230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.93241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.93255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.93271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.93289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.93318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.93331: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.93348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.93428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.93458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.93480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.93614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882410.95395: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 12081 1726882410.95501: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882410.95600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpfdjahp0i /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/AnsiballZ_command.py <<< 12081 1726882410.95696: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882410.97287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882410.97463: stderr chunk (state=3): >>><<< 12081 1726882410.97469: stdout chunk (state=3): >>><<< 12081 1726882410.97490: done transferring module to remote 12081 1726882410.97498: _low_level_execute_command(): starting 12081 1726882410.97503: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/ /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/AnsiballZ_command.py && sleep 0' 12081 1726882410.99171: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882410.99177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.99189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.99203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.99242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.99248: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882410.99258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.99280: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 12081 1726882410.99283: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882410.99286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882410.99291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882410.99300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882410.99312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882410.99319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882410.99326: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882410.99336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882410.99410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882410.99430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882410.99445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882410.99572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.01429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.01433: stdout chunk (state=3): >>><<< 12081 1726882411.01438: stderr chunk (state=3): >>><<< 12081 1726882411.01470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882411.01478: _low_level_execute_command(): starting 12081 1726882411.01481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/AnsiballZ_command.py && sleep 0' 12081 1726882411.03125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882411.03132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.03142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.03157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.03199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.03206: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882411.03215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.03228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882411.03238: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882411.03245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882411.03881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.03892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.03905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.03912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.03918: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882411.03928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.04004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.04024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.04036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.04176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.17628: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 21:33:31.171748", "end": "2024-09-20 21:33:31.174871", "delta": "0:00:00.003123", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882411.18784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882411.18864: stderr chunk (state=3): >>><<< 12081 1726882411.18868: stdout chunk (state=3): >>><<< 12081 1726882411.18888: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 21:33:31.171748", "end": "2024-09-20 21:33:31.174871", "delta": "0:00:00.003123", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882411.18917: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882411.18923: _low_level_execute_command(): starting 12081 1726882411.18929: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882410.8850443-13112-265877549184618/ > /dev/null 2>&1 && sleep 0' 12081 1726882411.21813: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.21817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.21858: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882411.21862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.21878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882411.21890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882411.21901: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 12081 1726882411.21912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.21925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.21941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.21956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.21970: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882411.21984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.22060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.22359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.22377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.22519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.24440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.24443: stdout chunk (state=3): >>><<< 12081 1726882411.24446: stderr chunk (state=3): >>><<< 12081 1726882411.24771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882411.24775: handler run complete 12081 1726882411.24777: Evaluated conditional (False): False 12081 1726882411.24780: variable 'bond_opt' from source: unknown 12081 1726882411.24782: variable 'result' from source: unknown 12081 1726882411.24784: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882411.24786: attempt loop complete, returning result 12081 1726882411.24788: variable 'bond_opt' from source: unknown 12081 1726882411.24790: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003123", "end": "2024-09-20 21:33:31.174871", "rc": 0, "start": "2024-09-20 21:33:31.171748" } STDOUT: encap2+3 3 12081 1726882411.24949: dumping result to json 12081 1726882411.25004: done dumping result, returning 12081 1726882411.25018: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [0e448fcc-3ce9-0a3f-ff3c-000000000400] 12081 1726882411.25029: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000400 12081 1726882411.25743: no more pending results, returning what we have 12081 1726882411.25747: results queue empty 12081 1726882411.25748: checking for any_errors_fatal 12081 1726882411.25752: done 
checking for any_errors_fatal 12081 1726882411.25753: checking for max_fail_percentage 12081 1726882411.25755: done checking for max_fail_percentage 12081 1726882411.25756: checking to see if all hosts have failed and the running result is not ok 12081 1726882411.25757: done checking to see if all hosts have failed 12081 1726882411.25757: getting the remaining hosts for this loop 12081 1726882411.25759: done getting the remaining hosts for this loop 12081 1726882411.25763: getting the next task for host managed_node3 12081 1726882411.25772: done getting next task for host managed_node3 12081 1726882411.25774: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 12081 1726882411.25777: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882411.25781: getting variables 12081 1726882411.25782: in VariableManager get_vars() 12081 1726882411.25809: Calling all_inventory to load vars for managed_node3 12081 1726882411.25812: Calling groups_inventory to load vars for managed_node3 12081 1726882411.25816: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882411.25829: Calling all_plugins_play to load vars for managed_node3 12081 1726882411.25832: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882411.25835: Calling groups_plugins_play to load vars for managed_node3 12081 1726882411.26357: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000400 12081 1726882411.26361: WORKER PROCESS EXITING 12081 1726882411.27936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882411.31842: done with get_vars() 12081 1726882411.31873: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 21:33:31 -0400 (0:00:07.809) 0:00:31.122 ****** 12081 1726882411.31960: entering _queue_task() for managed_node3/include_tasks 12081 1726882411.32283: worker is 1 (out of 1 available) 12081 1726882411.32297: exiting _queue_task() for managed_node3/include_tasks 12081 1726882411.32310: done queuing things up, now waiting for results queue to drain 12081 1726882411.32311: waiting for pending results... 
12081 1726882411.33264: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 12081 1726882411.33576: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000402 12081 1726882411.33632: variable 'ansible_search_path' from source: unknown 12081 1726882411.33703: variable 'ansible_search_path' from source: unknown 12081 1726882411.33751: calling self._execute() 12081 1726882411.33984: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882411.33995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882411.34055: variable 'omit' from source: magic vars 12081 1726882411.35643: variable 'ansible_distribution_major_version' from source: facts 12081 1726882411.35960: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882411.35977: _execute() done 12081 1726882411.35985: dumping result to json 12081 1726882411.36002: done dumping result, returning 12081 1726882411.36013: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000402] 12081 1726882411.36079: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000402 12081 1726882411.36229: no more pending results, returning what we have 12081 1726882411.36236: in VariableManager get_vars() 12081 1726882411.36279: Calling all_inventory to load vars for managed_node3 12081 1726882411.36282: Calling groups_inventory to load vars for managed_node3 12081 1726882411.36287: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882411.36304: Calling all_plugins_play to load vars for managed_node3 12081 1726882411.36306: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882411.36310: Calling groups_plugins_play to load vars for managed_node3 12081 1726882411.36829: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000402 12081 1726882411.36833: WORKER PROCESS EXITING 12081 
1726882411.39050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882411.42133: done with get_vars() 12081 1726882411.42165: variable 'ansible_search_path' from source: unknown 12081 1726882411.42167: variable 'ansible_search_path' from source: unknown 12081 1726882411.42178: variable 'item' from source: include params 12081 1726882411.42282: variable 'item' from source: include params 12081 1726882411.42317: we have included files to process 12081 1726882411.42318: generating all_blocks data 12081 1726882411.42320: done generating all_blocks data 12081 1726882411.42325: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882411.42327: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882411.42329: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882411.42692: done processing included file 12081 1726882411.42694: iterating over new_blocks loaded from include file 12081 1726882411.42696: in VariableManager get_vars() 12081 1726882411.42715: done with get_vars() 12081 1726882411.42717: filtering new block on tags 12081 1726882411.42745: done filtering new block on tags 12081 1726882411.42748: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 12081 1726882411.42753: extending task lists for all hosts with included blocks 12081 1726882411.42982: done extending task lists 12081 1726882411.42984: done processing included files 12081 1726882411.42985: results queue empty 12081 1726882411.42986: checking for any_errors_fatal 12081 1726882411.43002: 
done checking for any_errors_fatal 12081 1726882411.43003: checking for max_fail_percentage 12081 1726882411.43005: done checking for max_fail_percentage 12081 1726882411.43005: checking to see if all hosts have failed and the running result is not ok 12081 1726882411.43006: done checking to see if all hosts have failed 12081 1726882411.43007: getting the remaining hosts for this loop 12081 1726882411.43009: done getting the remaining hosts for this loop 12081 1726882411.43011: getting the next task for host managed_node3 12081 1726882411.43016: done getting next task for host managed_node3 12081 1726882411.43018: ^ task is: TASK: ** TEST check IPv4 12081 1726882411.43021: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882411.43023: getting variables 12081 1726882411.43024: in VariableManager get_vars() 12081 1726882411.43034: Calling all_inventory to load vars for managed_node3 12081 1726882411.43036: Calling groups_inventory to load vars for managed_node3 12081 1726882411.43038: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882411.43044: Calling all_plugins_play to load vars for managed_node3 12081 1726882411.43047: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882411.43050: Calling groups_plugins_play to load vars for managed_node3 12081 1726882411.45792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882411.49651: done with get_vars() 12081 1726882411.49690: done getting variables 12081 1726882411.49740: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 21:33:31 -0400 (0:00:00.178) 0:00:31.300 ****** 12081 1726882411.49776: entering _queue_task() for managed_node3/command 12081 1726882411.50127: worker is 1 (out of 1 available) 12081 1726882411.50139: exiting _queue_task() for managed_node3/command 12081 1726882411.50153: done queuing things up, now waiting for results queue to drain 12081 1726882411.50154: waiting for pending results... 
12081 1726882411.51260: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 12081 1726882411.51584: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000631 12081 1726882411.51609: variable 'ansible_search_path' from source: unknown 12081 1726882411.51711: variable 'ansible_search_path' from source: unknown 12081 1726882411.51759: calling self._execute() 12081 1726882411.51976: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882411.51989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882411.52001: variable 'omit' from source: magic vars 12081 1726882411.52844: variable 'ansible_distribution_major_version' from source: facts 12081 1726882411.52871: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882411.52882: variable 'omit' from source: magic vars 12081 1726882411.52944: variable 'omit' from source: magic vars 12081 1726882411.53288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882411.58272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882411.58363: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882411.58409: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882411.58455: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882411.58489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882411.58587: variable 'interface' from source: include params 12081 1726882411.58598: variable 'controller_device' from source: play vars 12081 1726882411.58677: variable 'controller_device' from source: play vars 12081 1726882411.58707: variable 'omit' 
from source: magic vars 12081 1726882411.58745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882411.58784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882411.58805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882411.58823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882411.58836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882411.58875: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882411.58887: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882411.58896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882411.59006: Set connection var ansible_pipelining to False 12081 1726882411.59013: Set connection var ansible_shell_type to sh 12081 1726882411.59023: Set connection var ansible_shell_executable to /bin/sh 12081 1726882411.59029: Set connection var ansible_connection to ssh 12081 1726882411.59038: Set connection var ansible_timeout to 10 12081 1726882411.59045: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882411.59080: variable 'ansible_shell_executable' from source: unknown 12081 1726882411.59088: variable 'ansible_connection' from source: unknown 12081 1726882411.59100: variable 'ansible_module_compression' from source: unknown 12081 1726882411.59107: variable 'ansible_shell_type' from source: unknown 12081 1726882411.59113: variable 'ansible_shell_executable' from source: unknown 12081 1726882411.59118: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882411.59125: variable 'ansible_pipelining' from source: unknown 
12081 1726882411.59131: variable 'ansible_timeout' from source: unknown 12081 1726882411.59138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882411.59250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882411.59272: variable 'omit' from source: magic vars 12081 1726882411.59281: starting attempt loop 12081 1726882411.59287: running the handler 12081 1726882411.59310: _low_level_execute_command(): starting 12081 1726882411.59325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882411.60104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882411.60118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.60131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.60148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.60200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.60213: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882411.60228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.60246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882411.60266: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882411.60281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882411.60297: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882411.60311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.60328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.60339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.60349: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882411.60369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.60447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.60477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.60498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.60640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.62345: stdout chunk (state=3): >>>/root <<< 12081 1726882411.62515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.62521: stdout chunk (state=3): >>><<< 12081 1726882411.62530: stderr chunk (state=3): >>><<< 12081 1726882411.62557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882411.62570: _low_level_execute_command(): starting 12081 1726882411.62573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231 `" && echo ansible-tmp-1726882411.6255374-13544-142607693147231="` echo /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231 `" ) && sleep 0' 12081 1726882411.65260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.65268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.65313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.65317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.65331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.65337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.65429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.65435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.65455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.65645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.67520: stdout chunk (state=3): >>>ansible-tmp-1726882411.6255374-13544-142607693147231=/root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231 <<< 12081 1726882411.67694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.67698: stderr chunk (state=3): >>><<< 12081 1726882411.67702: stdout chunk (state=3): >>><<< 12081 1726882411.67726: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882411.6255374-13544-142607693147231=/root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882411.67759: variable 'ansible_module_compression' from source: unknown 12081 1726882411.67816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882411.67848: variable 'ansible_facts' from source: unknown 12081 1726882411.67940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/AnsiballZ_command.py 12081 1726882411.68414: Sending initial data 12081 1726882411.68418: Sent initial data (156 bytes) 12081 1726882411.71171: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882411.71175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.71177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.71179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.71181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.71183: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882411.71185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.71191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882411.71193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882411.71195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882411.71196: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.71198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.71201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.71202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.71204: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882411.71206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.71244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.71319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.71331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.71462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.73241: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882411.73339: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882411.73444: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpk9x85gx9 
/root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/AnsiballZ_command.py <<< 12081 1726882411.73544: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882411.75234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.75238: stderr chunk (state=3): >>><<< 12081 1726882411.75243: stdout chunk (state=3): >>><<< 12081 1726882411.75267: done transferring module to remote 12081 1726882411.75279: _low_level_execute_command(): starting 12081 1726882411.75284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/ /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/AnsiballZ_command.py && sleep 0' 12081 1726882411.76767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882411.76896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.76906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.76920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.76958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.76967: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882411.76977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.76992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882411.77001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882411.77009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882411.77018: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882411.77031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.77042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.77050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882411.77056: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882411.77071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.77269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.77272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.77278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.77405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.79259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882411.79262: stdout chunk (state=3): >>><<< 12081 1726882411.79270: stderr chunk (state=3): >>><<< 12081 1726882411.79288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882411.79291: _low_level_execute_command(): starting 12081 1726882411.79296: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/AnsiballZ_command.py && sleep 0' 12081 1726882411.80717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.80721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.80913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.80917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.80931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.80937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.81017: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.81108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.81113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.81255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882411.94833: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:31.943204", "end": "2024-09-20 21:33:31.946765", "delta": "0:00:00.003561", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882411.95998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882411.96082: stderr chunk (state=3): >>><<< 12081 1726882411.96086: stdout chunk (state=3): >>><<< 12081 1726882411.96229: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:31.943204", "end": "2024-09-20 21:33:31.946765", "delta": "0:00:00.003561", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.9.105 closed. 12081 1726882411.96239: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882411.96241: _low_level_execute_command(): starting 12081 1726882411.96244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882411.6255374-13544-142607693147231/ > /dev/null 2>&1 && sleep 0' 12081 1726882411.98633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.98707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882411.98711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882411.98753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882411.98757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882411.98760: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882411.98762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882411.98819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882411.99217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882411.99282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882411.99405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.01257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.01345: stderr chunk (state=3): >>><<< 12081 1726882412.01349: stdout chunk (state=3): >>><<< 12081 1726882412.01671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882412.01675: handler run complete 12081 1726882412.01678: Evaluated conditional (False): False 12081 1726882412.01680: variable 'address' from source: include params 12081 1726882412.01682: variable 'result' from source: set_fact 12081 1726882412.01684: Evaluated conditional (address in result.stdout): True 12081 1726882412.01686: attempt loop complete, returning result 12081 1726882412.01688: _execute() done 12081 1726882412.01689: dumping result to json 12081 1726882412.01691: done dumping result, returning 12081 1726882412.01693: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0e448fcc-3ce9-0a3f-ff3c-000000000631] 12081 1726882412.01695: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000631 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003561", "end": "2024-09-20 21:33:31.946765", "rc": 0, "start": "2024-09-20 21:33:31.943204" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 228sec preferred_lft 228sec 12081 1726882412.01867: no more pending results, returning what we have 12081 1726882412.01872: results queue empty 12081 1726882412.01873: checking for any_errors_fatal 12081 1726882412.01874: done checking for any_errors_fatal 12081 1726882412.01875: checking for max_fail_percentage 12081 1726882412.01877: done checking for max_fail_percentage 12081 1726882412.01878: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.01880: done checking to see if all hosts have failed 12081 1726882412.01880: getting the remaining hosts for this loop 12081 1726882412.01882: done getting the 
remaining hosts for this loop 12081 1726882412.01886: getting the next task for host managed_node3 12081 1726882412.01896: done getting next task for host managed_node3 12081 1726882412.01899: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 12081 1726882412.01903: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882412.01909: getting variables 12081 1726882412.01911: in VariableManager get_vars() 12081 1726882412.01943: Calling all_inventory to load vars for managed_node3 12081 1726882412.01945: Calling groups_inventory to load vars for managed_node3 12081 1726882412.01949: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.01961: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.01965: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.01968: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.03290: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000631 12081 1726882412.03293: WORKER PROCESS EXITING 12081 1726882412.04879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.08050: done with get_vars() 12081 1726882412.08087: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 21:33:32 -0400 (0:00:00.584) 0:00:31.884 ****** 12081 1726882412.08180: entering _queue_task() for managed_node3/include_tasks 12081 1726882412.09214: worker is 1 (out of 1 available) 12081 1726882412.09226: exiting _queue_task() for managed_node3/include_tasks 12081 1726882412.09239: done queuing things up, now waiting for results queue to drain 12081 1726882412.09240: waiting for pending results... 
12081 1726882412.11148: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 12081 1726882412.11247: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000403 12081 1726882412.11267: variable 'ansible_search_path' from source: unknown 12081 1726882412.11271: variable 'ansible_search_path' from source: unknown 12081 1726882412.11451: calling self._execute() 12081 1726882412.11558: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.11572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.11587: variable 'omit' from source: magic vars 12081 1726882412.11939: variable 'ansible_distribution_major_version' from source: facts 12081 1726882412.11961: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882412.11980: _execute() done 12081 1726882412.11990: dumping result to json 12081 1726882412.11997: done dumping result, returning 12081 1726882412.12008: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000403] 12081 1726882412.12023: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000403 12081 1726882412.12837: no more pending results, returning what we have 12081 1726882412.12843: in VariableManager get_vars() 12081 1726882412.12890: Calling all_inventory to load vars for managed_node3 12081 1726882412.12893: Calling groups_inventory to load vars for managed_node3 12081 1726882412.12897: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.12914: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.12917: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.12920: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.14009: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000403 12081 1726882412.14014: WORKER PROCESS EXITING 12081 
1726882412.14933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.17176: done with get_vars() 12081 1726882412.17207: variable 'ansible_search_path' from source: unknown 12081 1726882412.17209: variable 'ansible_search_path' from source: unknown 12081 1726882412.17334: variable 'item' from source: include params 12081 1726882412.17555: variable 'item' from source: include params 12081 1726882412.17593: we have included files to process 12081 1726882412.17594: generating all_blocks data 12081 1726882412.17596: done generating all_blocks data 12081 1726882412.17601: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882412.17603: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882412.17605: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882412.17947: done processing included file 12081 1726882412.17949: iterating over new_blocks loaded from include file 12081 1726882412.17950: in VariableManager get_vars() 12081 1726882412.17970: done with get_vars() 12081 1726882412.17972: filtering new block on tags 12081 1726882412.18014: done filtering new block on tags 12081 1726882412.18017: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 12081 1726882412.18023: extending task lists for all hosts with included blocks 12081 1726882412.18420: done extending task lists 12081 1726882412.18421: done processing included files 12081 1726882412.18422: results queue empty 12081 1726882412.18423: checking for any_errors_fatal 12081 1726882412.18428: 
done checking for any_errors_fatal 12081 1726882412.18428: checking for max_fail_percentage 12081 1726882412.18430: done checking for max_fail_percentage 12081 1726882412.18430: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.18431: done checking to see if all hosts have failed 12081 1726882412.18432: getting the remaining hosts for this loop 12081 1726882412.18434: done getting the remaining hosts for this loop 12081 1726882412.18436: getting the next task for host managed_node3 12081 1726882412.18440: done getting next task for host managed_node3 12081 1726882412.18443: ^ task is: TASK: ** TEST check IPv6 12081 1726882412.18446: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882412.18448: getting variables 12081 1726882412.18449: in VariableManager get_vars() 12081 1726882412.18459: Calling all_inventory to load vars for managed_node3 12081 1726882412.18461: Calling groups_inventory to load vars for managed_node3 12081 1726882412.18465: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.18471: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.18473: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.18476: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.19969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.27407: done with get_vars() 12081 1726882412.27440: done getting variables 12081 1726882412.27492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 21:33:32 -0400 (0:00:00.193) 0:00:32.078 ****** 12081 1726882412.27526: entering _queue_task() for managed_node3/command 12081 1726882412.27872: worker is 1 (out of 1 available) 12081 1726882412.27885: exiting _queue_task() for managed_node3/command 12081 1726882412.27899: done queuing things up, now waiting for results queue to drain 12081 1726882412.27901: waiting for pending results... 
12081 1726882412.28220: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 12081 1726882412.28371: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000652 12081 1726882412.28397: variable 'ansible_search_path' from source: unknown 12081 1726882412.28409: variable 'ansible_search_path' from source: unknown 12081 1726882412.28459: calling self._execute() 12081 1726882412.28562: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.28576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.28592: variable 'omit' from source: magic vars 12081 1726882412.29011: variable 'ansible_distribution_major_version' from source: facts 12081 1726882412.29030: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882412.29043: variable 'omit' from source: magic vars 12081 1726882412.29113: variable 'omit' from source: magic vars 12081 1726882412.29299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882412.31754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882412.31846: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882412.31897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882412.31943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882412.31977: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882412.32071: variable 'controller_device' from source: play vars 12081 1726882412.32106: variable 'omit' from source: magic vars 12081 1726882412.32150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882412.32185: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882412.32214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882412.32237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882412.32254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882412.32295: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882412.32304: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.32317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.32427: Set connection var ansible_pipelining to False 12081 1726882412.32434: Set connection var ansible_shell_type to sh 12081 1726882412.32445: Set connection var ansible_shell_executable to /bin/sh 12081 1726882412.32450: Set connection var ansible_connection to ssh 12081 1726882412.32458: Set connection var ansible_timeout to 10 12081 1726882412.32468: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882412.32502: variable 'ansible_shell_executable' from source: unknown 12081 1726882412.32510: variable 'ansible_connection' from source: unknown 12081 1726882412.32516: variable 'ansible_module_compression' from source: unknown 12081 1726882412.32523: variable 'ansible_shell_type' from source: unknown 12081 1726882412.32534: variable 'ansible_shell_executable' from source: unknown 12081 1726882412.32541: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.32548: variable 'ansible_pipelining' from source: unknown 12081 1726882412.32554: variable 'ansible_timeout' from source: unknown 12081 1726882412.32561: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882412.32685: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882412.32705: variable 'omit' from source: magic vars 12081 1726882412.32714: starting attempt loop 12081 1726882412.32719: running the handler 12081 1726882412.32736: _low_level_execute_command(): starting 12081 1726882412.32748: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882412.33515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882412.33529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.33543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.33559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.33607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.33624: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882412.33638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.33657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882412.33671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882412.33685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882412.33696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.33707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882412.33721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.33734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.33744: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882412.33756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.33839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.33860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882412.33877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.34076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.35729: stdout chunk (state=3): >>>/root <<< 12081 1726882412.35927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.35931: stdout chunk (state=3): >>><<< 12081 1726882412.35934: stderr chunk (state=3): >>><<< 12081 1726882412.36062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882412.36077: _low_level_execute_command(): starting 12081 1726882412.36081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901 `" && echo ansible-tmp-1726882412.3595922-13569-222606565930901="` echo /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901 `" ) && sleep 0' 12081 1726882412.37412: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.37417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.37578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882412.37582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882412.37585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.37588: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.37759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.37775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.37903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.39790: stdout chunk (state=3): >>>ansible-tmp-1726882412.3595922-13569-222606565930901=/root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901 <<< 12081 1726882412.39906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.39988: stderr chunk (state=3): >>><<< 12081 1726882412.39992: stdout chunk (state=3): >>><<< 12081 1726882412.40273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882412.3595922-13569-222606565930901=/root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882412.40276: variable 'ansible_module_compression' from source: unknown 12081 1726882412.40278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882412.40280: variable 'ansible_facts' from source: unknown 12081 1726882412.40283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/AnsiballZ_command.py 12081 1726882412.40825: Sending initial data 12081 1726882412.40828: Sent initial data (156 bytes) 12081 1726882412.43399: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882412.43459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.43479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.43497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.43538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.43554: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882412.43573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.43592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882412.43605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882412.43617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882412.43629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.43642: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882412.43657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.43672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.43685: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882412.43697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.43770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.43792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882412.43810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.43947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.45712: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882412.45816: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882412.45919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpc640wv53 /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/AnsiballZ_command.py <<< 12081 1726882412.46015: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 12081 1726882412.47592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.47773: stderr chunk (state=3): >>><<< 12081 1726882412.47778: stdout chunk (state=3): >>><<< 12081 1726882412.47780: done transferring module to remote 12081 1726882412.47783: _low_level_execute_command(): starting 12081 1726882412.47785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/ /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/AnsiballZ_command.py && sleep 0' 12081 1726882412.49216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882412.49223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.49234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.49248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.49293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.49578: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882412.49588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.49601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882412.49608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882412.49615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882412.49624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.49631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.49642: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.49650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.49735: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882412.49739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.50072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.50075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882412.50077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.50296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.52115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.52120: stdout chunk (state=3): >>><<< 12081 1726882412.52125: stderr chunk (state=3): >>><<< 12081 1726882412.52143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882412.52147: _low_level_execute_command(): starting 12081 1726882412.52150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/AnsiballZ_command.py && sleep 0' 12081 1726882412.53881: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.53886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.54060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.54066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.54081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882412.54084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.54174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.54243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882412.54246: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.54392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.67736: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a2/128 scope global dynamic noprefixroute \n valid_lft 228sec preferred_lft 228sec\n inet6 2001:db8::42b0:e712:4533:ebe/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::e214:6431:df72:31ee/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:32.672703", "end": "2024-09-20 21:33:32.675995", "delta": "0:00:00.003292", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882412.68862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882412.68916: stderr chunk (state=3): >>><<< 12081 1726882412.68920: stdout chunk (state=3): >>><<< 12081 1726882412.68935: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a2/128 scope global dynamic noprefixroute \n valid_lft 228sec preferred_lft 228sec\n inet6 2001:db8::42b0:e712:4533:ebe/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::e214:6431:df72:31ee/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:32.672703", "end": "2024-09-20 21:33:32.675995", "delta": "0:00:00.003292", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882412.68972: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882412.68981: _low_level_execute_command(): starting 12081 1726882412.68984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882412.3595922-13569-222606565930901/ > /dev/null 2>&1 && sleep 0' 12081 1726882412.69566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882412.69579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.69591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.69605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.69649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.69659: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882412.69673: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.69687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882412.69696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882412.69704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882412.69713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882412.69723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882412.69739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882412.69749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882412.69758: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882412.69771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882412.69837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882412.69865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882412.69879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882412.70005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882412.71813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882412.71864: stderr chunk (state=3): >>><<< 12081 1726882412.71868: stdout chunk (state=3): >>><<< 12081 1726882412.71883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882412.71892: handler run complete 12081 1726882412.71910: Evaluated conditional (False): False 12081 1726882412.72021: variable 'address' from source: include params 12081 1726882412.72024: variable 'result' from source: set_fact 12081 1726882412.72046: Evaluated conditional (address in result.stdout): True 12081 1726882412.72076: attempt loop complete, returning result 12081 1726882412.72079: _execute() done 12081 1726882412.72082: dumping result to json 12081 1726882412.72084: done dumping result, returning 12081 1726882412.72086: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0e448fcc-3ce9-0a3f-ff3c-000000000652] 12081 1726882412.72088: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000652 12081 1726882412.72185: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000652 12081 1726882412.72187: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003292", "end": "2024-09-20 21:33:32.675995", "rc": 0, "start": "2024-09-20 
21:33:32.672703" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::a2/128 scope global dynamic noprefixroute valid_lft 228sec preferred_lft 228sec inet6 2001:db8::42b0:e712:4533:ebe/64 scope global dynamic noprefixroute valid_lft 1798sec preferred_lft 1798sec inet6 fe80::e214:6431:df72:31ee/64 scope link noprefixroute valid_lft forever preferred_lft forever 12081 1726882412.72290: no more pending results, returning what we have 12081 1726882412.72294: results queue empty 12081 1726882412.72294: checking for any_errors_fatal 12081 1726882412.72296: done checking for any_errors_fatal 12081 1726882412.72296: checking for max_fail_percentage 12081 1726882412.72298: done checking for max_fail_percentage 12081 1726882412.72299: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.72300: done checking to see if all hosts have failed 12081 1726882412.72301: getting the remaining hosts for this loop 12081 1726882412.72302: done getting the remaining hosts for this loop 12081 1726882412.72306: getting the next task for host managed_node3 12081 1726882412.72315: done getting next task for host managed_node3 12081 1726882412.72318: ^ task is: TASK: Conditional asserts 12081 1726882412.72320: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882412.72324: getting variables 12081 1726882412.72326: in VariableManager get_vars() 12081 1726882412.72353: Calling all_inventory to load vars for managed_node3 12081 1726882412.72355: Calling groups_inventory to load vars for managed_node3 12081 1726882412.72358: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.72375: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.72377: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.72380: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.73912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.75123: done with get_vars() 12081 1726882412.75145: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:33:32 -0400 (0:00:00.476) 0:00:32.555 ****** 12081 1726882412.75219: entering _queue_task() for managed_node3/include_tasks 12081 1726882412.75461: worker is 1 (out of 1 available) 12081 1726882412.75476: exiting _queue_task() for managed_node3/include_tasks 12081 1726882412.75489: done queuing things up, now waiting for results queue to drain 12081 1726882412.75490: waiting for pending results... 
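The preceding task succeeded on the first attempt: the module reply came back as a single JSON object on stdout, and the controller then evaluated `Evaluated conditional (address in result.stdout): True`. A minimal sketch of that final check, using values copied from the log (the `address` include parameter's exact value is an assumption; this is not ansible-core internals, just the shape of the comparison):

```python
import json

# The remote AnsiballZ_command.py wrapper prints one JSON object; the
# controller parses it and applies the task's stdout condition.
raw_reply = json.dumps({
    "changed": True,
    "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP\n"
              "    inet6 2001:db8::a2/128 scope global dynamic noprefixroute",
    "rc": 0,
    "cmd": ["ip", "-6", "a", "s", "nm-bond"],
})

result = json.loads(raw_reply)
address = "2001:db8::a2"  # assumed value of the `address` include param
passed = result["rc"] == 0 and address in result["stdout"]
print(passed)
```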
12081 1726882412.75675: running TaskExecutor() for managed_node3/TASK: Conditional asserts 12081 1726882412.75744: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008e 12081 1726882412.75758: variable 'ansible_search_path' from source: unknown 12081 1726882412.75761: variable 'ansible_search_path' from source: unknown 12081 1726882412.75979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882412.79943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882412.80019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882412.80062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882412.80103: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882412.80134: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882412.80221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882412.80258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882412.80291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882412.80336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 12081 1726882412.80359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882412.80515: dumping result to json 12081 1726882412.80523: done dumping result, returning 12081 1726882412.80533: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0e448fcc-3ce9-0a3f-ff3c-00000000008e] 12081 1726882412.80544: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008e skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 12081 1726882412.80738: no more pending results, returning what we have 12081 1726882412.80742: results queue empty 12081 1726882412.80743: checking for any_errors_fatal 12081 1726882412.80755: done checking for any_errors_fatal 12081 1726882412.80756: checking for max_fail_percentage 12081 1726882412.80758: done checking for max_fail_percentage 12081 1726882412.80759: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.80760: done checking to see if all hosts have failed 12081 1726882412.80761: getting the remaining hosts for this loop 12081 1726882412.80762: done getting the remaining hosts for this loop 12081 1726882412.80771: getting the next task for host managed_node3 12081 1726882412.80778: done getting next task for host managed_node3 12081 1726882412.80780: ^ task is: TASK: Success in test '{{ lsr_description }}' 12081 1726882412.80783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882412.80787: getting variables 12081 1726882412.80789: in VariableManager get_vars() 12081 1726882412.80819: Calling all_inventory to load vars for managed_node3 12081 1726882412.80822: Calling groups_inventory to load vars for managed_node3 12081 1726882412.80825: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.80838: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.80841: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.80844: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.81366: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008e 12081 1726882412.81371: WORKER PROCESS EXITING 12081 1726882412.81868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.83137: done with get_vars() 12081 1726882412.83169: done getting variables 12081 1726882412.83230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882412.83364: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:33:32 -0400 (0:00:00.081) 0:00:32.636 ****** 12081 1726882412.83398: entering _queue_task() for 
managed_node3/debug 12081 1726882412.83734: worker is 1 (out of 1 available) 12081 1726882412.83748: exiting _queue_task() for managed_node3/debug 12081 1726882412.83766: done queuing things up, now waiting for results queue to drain 12081 1726882412.83768: waiting for pending results... 12081 1726882412.83982: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 12081 1726882412.84060: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000008f 12081 1726882412.84072: variable 'ansible_search_path' from source: unknown 12081 1726882412.84076: variable 'ansible_search_path' from source: unknown 12081 1726882412.84107: calling self._execute() 12081 1726882412.84179: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.84183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.84192: variable 'omit' from source: magic vars 12081 1726882412.84467: variable 'ansible_distribution_major_version' from source: facts 12081 1726882412.84478: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882412.84484: variable 'omit' from source: magic vars 12081 1726882412.84512: variable 'omit' from source: magic vars 12081 1726882412.84582: variable 'lsr_description' from source: include params 12081 1726882412.84596: variable 'omit' from source: magic vars 12081 1726882412.84630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882412.84657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882412.84674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882412.84686: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882412.84696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882412.84719: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882412.84723: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.84725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.84796: Set connection var ansible_pipelining to False 12081 1726882412.84800: Set connection var ansible_shell_type to sh 12081 1726882412.84805: Set connection var ansible_shell_executable to /bin/sh 12081 1726882412.84807: Set connection var ansible_connection to ssh 12081 1726882412.84813: Set connection var ansible_timeout to 10 12081 1726882412.84817: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882412.84835: variable 'ansible_shell_executable' from source: unknown 12081 1726882412.84839: variable 'ansible_connection' from source: unknown 12081 1726882412.84841: variable 'ansible_module_compression' from source: unknown 12081 1726882412.84843: variable 'ansible_shell_type' from source: unknown 12081 1726882412.84846: variable 'ansible_shell_executable' from source: unknown 12081 1726882412.84849: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.84854: variable 'ansible_pipelining' from source: unknown 12081 1726882412.84859: variable 'ansible_timeout' from source: unknown 12081 1726882412.84862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.84957: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882412.84967: variable 'omit' from source: magic vars 12081 1726882412.84972: starting attempt loop 12081 1726882412.84976: running the handler 12081 1726882412.85013: handler run complete 12081 1726882412.85023: attempt loop complete, returning result 12081 1726882412.85027: _execute() done 12081 1726882412.85030: dumping result to json 12081 1726882412.85033: done dumping result, returning 12081 1726882412.85040: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0e448fcc-3ce9-0a3f-ff3c-00000000008f] 12081 1726882412.85046: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008f 12081 1726882412.85135: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000008f 12081 1726882412.85137: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
+++++ 12081 1726882412.85191: no more pending results, returning what we have 12081 1726882412.85197: results queue empty 12081 1726882412.85198: checking for any_errors_fatal 12081 1726882412.85204: done checking for any_errors_fatal 12081 1726882412.85205: checking for max_fail_percentage 12081 1726882412.85206: done checking for max_fail_percentage 12081 1726882412.85207: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.85208: done checking to see if all hosts have failed 12081 1726882412.85209: getting the remaining hosts for this loop 12081 1726882412.85211: done getting the remaining hosts for this loop 12081 1726882412.85215: getting the next task for host managed_node3 12081 1726882412.85224: done getting next task for host managed_node3 12081 1726882412.85227: ^ task is: TASK: Cleanup 12081 1726882412.85230: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882412.85235: getting variables 12081 1726882412.85236: in VariableManager get_vars() 12081 1726882412.85275: Calling all_inventory to load vars for managed_node3 12081 1726882412.85278: Calling groups_inventory to load vars for managed_node3 12081 1726882412.85282: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.85292: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.85294: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.85297: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.86564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.88055: done with get_vars() 12081 1726882412.88095: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:33:32 -0400 (0:00:00.048) 0:00:32.685 ****** 12081 1726882412.88224: entering _queue_task() for managed_node3/include_tasks 12081 1726882412.88527: worker is 1 (out of 1 available) 12081 1726882412.88542: exiting _queue_task() for managed_node3/include_tasks 12081 1726882412.88557: done queuing things up, now waiting for results queue to drain 12081 1726882412.88559: waiting for pending results... 
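The nested "HOST STATE" dumps above encode the linear strategy's per-host position as `key=value` pairs (block/task indices, run_state, rescue/always bookkeeping), with child states recursively embedded. A small helper — a reading aid sketched here, not part of Ansible itself — can pull the top-level counters out of such a line; `setdefault` keeps only the first occurrence of each key, so nested child-state values do not overwrite the outer ones:

```python
import re

def parse_host_state(state: str) -> dict:
    """Extract the flat key=value counters (block, task, run_state, ...)
    from an ansible 'HOST STATE' debug line. Nested child states match
    the same pattern, so only the first (outermost) value per key is kept."""
    out = {}
    for key, val in re.findall(r"(\w+)=(\w+)", state):
        out.setdefault(key, val)
    return out

line = ("HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, "
        "run_state=1, fail_state=0, pre_flushing_run_state=1, "
        "update_handlers=True, pending_setup=False")
print(parse_host_state(line))
```

Applied to the dumps in this log, it shows the play progressing through block 3 while the included test tasks advance their inner `task` counter.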
12081 1726882412.88902: running TaskExecutor() for managed_node3/TASK: Cleanup 12081 1726882412.89028: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000093 12081 1726882412.89055: variable 'ansible_search_path' from source: unknown 12081 1726882412.89074: variable 'ansible_search_path' from source: unknown 12081 1726882412.89131: variable 'lsr_cleanup' from source: include params 12081 1726882412.89394: variable 'lsr_cleanup' from source: include params 12081 1726882412.89449: variable 'omit' from source: magic vars 12081 1726882412.89560: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.89564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.89573: variable 'omit' from source: magic vars 12081 1726882412.89810: variable 'ansible_distribution_major_version' from source: facts 12081 1726882412.89818: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882412.89824: variable 'item' from source: unknown 12081 1726882412.89872: variable 'item' from source: unknown 12081 1726882412.89909: variable 'item' from source: unknown 12081 1726882412.89959: variable 'item' from source: unknown 12081 1726882412.90093: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882412.90097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882412.90100: variable 'omit' from source: magic vars 12081 1726882412.90194: variable 'ansible_distribution_major_version' from source: facts 12081 1726882412.90197: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882412.90203: variable 'item' from source: unknown 12081 1726882412.90266: variable 'item' from source: unknown 12081 1726882412.90302: variable 'item' from source: unknown 12081 1726882412.90354: variable 'item' from source: unknown 12081 1726882412.90418: dumping result to json 12081 1726882412.90421: done dumping result, returning 12081 
1726882412.90423: done running TaskExecutor() for managed_node3/TASK: Cleanup [0e448fcc-3ce9-0a3f-ff3c-000000000093] 12081 1726882412.90430: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000093 12081 1726882412.90467: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000093 12081 1726882412.90470: WORKER PROCESS EXITING 12081 1726882412.90492: no more pending results, returning what we have 12081 1726882412.90497: in VariableManager get_vars() 12081 1726882412.90531: Calling all_inventory to load vars for managed_node3 12081 1726882412.90535: Calling groups_inventory to load vars for managed_node3 12081 1726882412.90541: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.90556: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.90559: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.90561: Calling groups_plugins_play to load vars for managed_node3 12081 1726882412.91495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882412.92632: done with get_vars() 12081 1726882412.92671: variable 'ansible_search_path' from source: unknown 12081 1726882412.92672: variable 'ansible_search_path' from source: unknown 12081 1726882412.92748: variable 'ansible_search_path' from source: unknown 12081 1726882412.92750: variable 'ansible_search_path' from source: unknown 12081 1726882412.92811: we have included files to process 12081 1726882412.92813: generating all_blocks data 12081 1726882412.92816: done generating all_blocks data 12081 1726882412.92828: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882412.92829: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882412.92832: Loading 
data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882412.93175: in VariableManager get_vars() 12081 1726882412.93196: done with get_vars() 12081 1726882412.93201: variable 'omit' from source: magic vars 12081 1726882412.93239: variable 'omit' from source: magic vars 12081 1726882412.93339: in VariableManager get_vars() 12081 1726882412.93354: done with get_vars() 12081 1726882412.93426: in VariableManager get_vars() 12081 1726882412.93443: done with get_vars() 12081 1726882412.93504: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12081 1726882412.93804: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12081 1726882412.93924: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12081 1726882412.94435: in VariableManager get_vars() 12081 1726882412.94459: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882412.96318: done processing included file 12081 1726882412.96320: iterating over new_blocks loaded from include file 12081 1726882412.96321: in VariableManager get_vars() 12081 1726882412.96344: done with get_vars() 12081 1726882412.96346: filtering new block on tags 12081 1726882412.96606: done filtering new block on tags 12081 1726882412.96609: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 12081 1726882412.96613: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882412.96613: loading included 
file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882412.96616: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882412.97005: done processing included file 12081 1726882412.97007: iterating over new_blocks loaded from include file 12081 1726882412.97008: in VariableManager get_vars() 12081 1726882412.97019: done with get_vars() 12081 1726882412.97021: filtering new block on tags 12081 1726882412.97039: done filtering new block on tags 12081 1726882412.97040: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 12081 1726882412.97043: extending task lists for all hosts with included blocks 12081 1726882412.98959: done extending task lists 12081 1726882412.98961: done processing included files 12081 1726882412.98962: results queue empty 12081 1726882412.98962: checking for any_errors_fatal 12081 1726882412.98969: done checking for any_errors_fatal 12081 1726882412.98970: checking for max_fail_percentage 12081 1726882412.98971: done checking for max_fail_percentage 12081 1726882412.98972: checking to see if all hosts have failed and the running result is not ok 12081 1726882412.98973: done checking to see if all hosts have failed 12081 1726882412.98974: getting the remaining hosts for this loop 12081 1726882412.98975: done getting the remaining hosts for this loop 12081 1726882412.98982: getting the next task for host managed_node3 12081 1726882412.98989: done getting next task for host managed_node3 12081 1726882412.98998: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882412.99002: ^ state is: 
HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882412.99017: getting variables 12081 1726882412.99018: in VariableManager get_vars() 12081 1726882412.99042: Calling all_inventory to load vars for managed_node3 12081 1726882412.99045: Calling groups_inventory to load vars for managed_node3 12081 1726882412.99048: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882412.99057: Calling all_plugins_play to load vars for managed_node3 12081 1726882412.99060: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882412.99065: Calling groups_plugins_play to load vars for managed_node3 12081 1726882413.02104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.06484: done with get_vars() 12081 1726882413.06519: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:33 -0400 (0:00:00.184) 0:00:32.870 ****** 12081 1726882413.06720: entering _queue_task() for managed_node3/include_tasks 12081 1726882413.07830: worker is 1 (out of 1 available) 12081 1726882413.07842: exiting _queue_task() for managed_node3/include_tasks 12081 1726882413.07855: done queuing things up, now waiting for results queue to drain 12081 1726882413.07857: waiting for pending results... 
12081 1726882413.08687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882413.09137: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000693 12081 1726882413.09143: variable 'ansible_search_path' from source: unknown 12081 1726882413.09146: variable 'ansible_search_path' from source: unknown 12081 1726882413.09202: calling self._execute() 12081 1726882413.09407: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.09412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.09416: variable 'omit' from source: magic vars 12081 1726882413.09857: variable 'ansible_distribution_major_version' from source: facts 12081 1726882413.09884: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882413.09903: _execute() done 12081 1726882413.09908: dumping result to json 12081 1726882413.09911: done dumping result, returning 12081 1726882413.09914: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-0a3f-ff3c-000000000693] 12081 1726882413.09916: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000693 12081 1726882413.10023: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000693 12081 1726882413.10026: WORKER PROCESS EXITING 12081 1726882413.10086: no more pending results, returning what we have 12081 1726882413.10093: in VariableManager get_vars() 12081 1726882413.10159: Calling all_inventory to load vars for managed_node3 12081 1726882413.10162: Calling groups_inventory to load vars for managed_node3 12081 1726882413.10167: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882413.10179: Calling all_plugins_play to load vars for managed_node3 12081 1726882413.10181: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882413.10183: Calling 
groups_plugins_play to load vars for managed_node3 12081 1726882413.12105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.14037: done with get_vars() 12081 1726882413.14067: variable 'ansible_search_path' from source: unknown 12081 1726882413.14069: variable 'ansible_search_path' from source: unknown 12081 1726882413.14113: we have included files to process 12081 1726882413.14114: generating all_blocks data 12081 1726882413.14116: done generating all_blocks data 12081 1726882413.14122: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882413.14123: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882413.14126: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882413.14743: done processing included file 12081 1726882413.14745: iterating over new_blocks loaded from include file 12081 1726882413.14747: in VariableManager get_vars() 12081 1726882413.14782: done with get_vars() 12081 1726882413.14784: filtering new block on tags 12081 1726882413.14820: done filtering new block on tags 12081 1726882413.14823: in VariableManager get_vars() 12081 1726882413.14848: done with get_vars() 12081 1726882413.14850: filtering new block on tags 12081 1726882413.14910: done filtering new block on tags 12081 1726882413.14913: in VariableManager get_vars() 12081 1726882413.14939: done with get_vars() 12081 1726882413.14941: filtering new block on tags 12081 1726882413.14988: done filtering new block on tags 12081 1726882413.14991: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12081 1726882413.15001: extending task lists for 
all hosts with included blocks 12081 1726882413.16911: done extending task lists 12081 1726882413.16913: done processing included files 12081 1726882413.16914: results queue empty 12081 1726882413.16914: checking for any_errors_fatal 12081 1726882413.16919: done checking for any_errors_fatal 12081 1726882413.16920: checking for max_fail_percentage 12081 1726882413.16921: done checking for max_fail_percentage 12081 1726882413.16922: checking to see if all hosts have failed and the running result is not ok 12081 1726882413.16923: done checking to see if all hosts have failed 12081 1726882413.16924: getting the remaining hosts for this loop 12081 1726882413.16925: done getting the remaining hosts for this loop 12081 1726882413.16928: getting the next task for host managed_node3 12081 1726882413.16933: done getting next task for host managed_node3 12081 1726882413.16936: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882413.16940: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882413.16950: getting variables 12081 1726882413.16952: in VariableManager get_vars() 12081 1726882413.16978: Calling all_inventory to load vars for managed_node3 12081 1726882413.16980: Calling groups_inventory to load vars for managed_node3 12081 1726882413.16983: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882413.16989: Calling all_plugins_play to load vars for managed_node3 12081 1726882413.16991: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882413.16994: Calling groups_plugins_play to load vars for managed_node3 12081 1726882413.18306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.20332: done with get_vars() 12081 1726882413.20366: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:33 -0400 (0:00:00.137) 0:00:33.007 ****** 12081 1726882413.20457: entering _queue_task() for managed_node3/setup 12081 1726882413.20820: worker is 1 (out of 1 available) 12081 1726882413.20833: exiting _queue_task() for managed_node3/setup 12081 1726882413.20846: done queuing things up, now waiting for results queue to drain 12081 1726882413.20847: waiting for pending results... 
12081 1726882413.21153: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882413.21342: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000007c9 12081 1726882413.21365: variable 'ansible_search_path' from source: unknown 12081 1726882413.21374: variable 'ansible_search_path' from source: unknown 12081 1726882413.21423: calling self._execute() 12081 1726882413.21521: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.21534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.21546: variable 'omit' from source: magic vars 12081 1726882413.21930: variable 'ansible_distribution_major_version' from source: facts 12081 1726882413.21958: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882413.22201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882413.24873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882413.24952: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882413.25005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882413.25043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882413.25082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882413.25169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882413.25213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882413.25245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882413.25295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882413.25323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882413.25382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882413.25416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882413.25451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882413.25499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882413.25523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882413.25703: variable '__network_required_facts' from source: role 
'' defaults 12081 1726882413.25718: variable 'ansible_facts' from source: unknown 12081 1726882413.26521: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12081 1726882413.26531: when evaluation is False, skipping this task 12081 1726882413.26538: _execute() done 12081 1726882413.26546: dumping result to json 12081 1726882413.26554: done dumping result, returning 12081 1726882413.26569: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-0a3f-ff3c-0000000007c9] 12081 1726882413.26580: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007c9 12081 1726882413.26699: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007c9 12081 1726882413.26706: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882413.26756: no more pending results, returning what we have 12081 1726882413.26760: results queue empty 12081 1726882413.26761: checking for any_errors_fatal 12081 1726882413.26762: done checking for any_errors_fatal 12081 1726882413.26763: checking for max_fail_percentage 12081 1726882413.26766: done checking for max_fail_percentage 12081 1726882413.26767: checking to see if all hosts have failed and the running result is not ok 12081 1726882413.26769: done checking to see if all hosts have failed 12081 1726882413.26769: getting the remaining hosts for this loop 12081 1726882413.26771: done getting the remaining hosts for this loop 12081 1726882413.26776: getting the next task for host managed_node3 12081 1726882413.26787: done getting next task for host managed_node3 12081 1726882413.26791: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882413.26798: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882413.26815: getting variables 12081 1726882413.26818: in VariableManager get_vars() 12081 1726882413.26857: Calling all_inventory to load vars for managed_node3 12081 1726882413.26861: Calling groups_inventory to load vars for managed_node3 12081 1726882413.26865: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882413.26878: Calling all_plugins_play to load vars for managed_node3 12081 1726882413.26881: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882413.26891: Calling groups_plugins_play to load vars for managed_node3 12081 1726882413.28757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.30554: done with get_vars() 12081 1726882413.30587: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:33 -0400 (0:00:00.102) 0:00:33.109 ****** 12081 1726882413.30699: entering _queue_task() for managed_node3/stat 12081 1726882413.31053: worker is 1 (out of 1 available) 12081 1726882413.31068: exiting _queue_task() for managed_node3/stat 12081 1726882413.31085: done queuing things up, now waiting for results queue to drain 12081 1726882413.31086: waiting for pending results... 
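The skip of the "Ensure ansible_facts used by role are present" task above comes from the gate `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the role only re-runs fact gathering when some required fact is missing. Ansible's `difference` filter is a set difference, so a rough Python equivalent looks like the sketch below (the variable contents are illustrative, not taken from this run):

```python
# Rough Python equivalent of the Jinja2 gate seen in the log:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

def needs_setup(required_facts, ansible_facts):
    """True when at least one required fact has not been gathered yet.

    Mirrors ansible's `difference` filter, which is a set difference
    of the two sequences.
    """
    missing = set(required_facts) - set(ansible_facts)
    return len(missing) > 0

# All required facts already present -> the setup task is skipped,
# which is exactly what the "Evaluated conditional ...: False" line shows:
print(needs_setup(["distribution", "os_family"],
                  {"distribution": "CentOS", "os_family": "RedHat"}))  # False

# A missing fact would instead force a fact-gathering (setup) run:
print(needs_setup(["distribution", "os_family"],
                  {"distribution": "CentOS"}))  # True
```

Because the conditional evaluated to `False` here, the executor short-circuits before ever building a module invocation, which is why the result above is a bare skip (`"changed": false`) rather than a remote execution.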
12081 1726882413.31385: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882413.31577: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000007cb 12081 1726882413.31601: variable 'ansible_search_path' from source: unknown 12081 1726882413.31610: variable 'ansible_search_path' from source: unknown 12081 1726882413.31663: calling self._execute() 12081 1726882413.31771: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.31783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.31797: variable 'omit' from source: magic vars 12081 1726882413.32182: variable 'ansible_distribution_major_version' from source: facts 12081 1726882413.32200: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882413.32373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882413.32666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882413.32718: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882413.32759: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882413.32799: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882413.32898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882413.32933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882413.32970: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882413.33001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882413.33102: variable '__network_is_ostree' from source: set_fact 12081 1726882413.33114: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882413.33122: when evaluation is False, skipping this task 12081 1726882413.33128: _execute() done 12081 1726882413.33136: dumping result to json 12081 1726882413.33145: done dumping result, returning 12081 1726882413.33167: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-0a3f-ff3c-0000000007cb] 12081 1726882413.33179: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cb skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882413.33339: no more pending results, returning what we have 12081 1726882413.33344: results queue empty 12081 1726882413.33345: checking for any_errors_fatal 12081 1726882413.33354: done checking for any_errors_fatal 12081 1726882413.33355: checking for max_fail_percentage 12081 1726882413.33357: done checking for max_fail_percentage 12081 1726882413.33358: checking to see if all hosts have failed and the running result is not ok 12081 1726882413.33359: done checking to see if all hosts have failed 12081 1726882413.33360: getting the remaining hosts for this loop 12081 1726882413.33362: done getting the remaining hosts for this loop 12081 1726882413.33368: getting the next task for host managed_node3 12081 1726882413.33377: done getting next task for host managed_node3 12081 
1726882413.33381: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882413.33389: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882413.33407: getting variables 12081 1726882413.33410: in VariableManager get_vars() 12081 1726882413.33451: Calling all_inventory to load vars for managed_node3 12081 1726882413.33454: Calling groups_inventory to load vars for managed_node3 12081 1726882413.33456: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882413.33472: Calling all_plugins_play to load vars for managed_node3 12081 1726882413.33476: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882413.33479: Calling groups_plugins_play to load vars for managed_node3 12081 1726882413.34513: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cb 12081 1726882413.34517: WORKER PROCESS EXITING 12081 1726882413.35272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.37217: done with get_vars() 12081 1726882413.37242: done getting variables 12081 1726882413.37311: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:33 -0400 (0:00:00.066) 0:00:33.176 ****** 12081 1726882413.37359: entering _queue_task() for managed_node3/set_fact 12081 1726882413.37736: worker is 1 (out of 1 available) 12081 1726882413.37748: exiting _queue_task() for managed_node3/set_fact 12081 1726882413.37761: done queuing things up, now waiting for results queue to drain 12081 1726882413.37766: waiting for pending results... 
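Both ostree tasks above (the `stat` at set_facts.yml:12 and the `set_fact` at set_facts.yml:17) share the guard shown in the log, `not __network_is_ostree is defined`. The effect is a cached check: once an earlier pass through the role has set the fact, every later pass skips both tasks. A minimal sketch of that pattern (the fact name is from the log; the `/run/ostree-booted` marker path is my assumption about what the stat targets, as it is not shown in this excerpt):

```python
# Sketch of the guard both ostree tasks share, per the log:
#   when: not __network_is_ostree is defined

def should_run_ostree_check(host_facts):
    """Run the ostree marker stat only if the cached flag is absent.

    In the role, the stat result (assumed: existence of /run/ostree-booted)
    is stored via set_fact, so the check runs at most once per host.
    """
    return "__network_is_ostree" not in host_facts

facts = {}
assert should_run_ostree_check(facts)        # first pass: the check would run
facts["__network_is_ostree"] = False         # result cached via set_fact
assert not should_run_ostree_check(facts)    # later passes skip, as logged
```

That explains why the log reports `false_condition: "not __network_is_ostree is defined"` twice in a row: this play had already resolved the fact earlier, so both the probe and the flag-setting task are no-ops now.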
12081 1726882413.38077: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882413.38269: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000007cc 12081 1726882413.38290: variable 'ansible_search_path' from source: unknown 12081 1726882413.38299: variable 'ansible_search_path' from source: unknown 12081 1726882413.38349: calling self._execute() 12081 1726882413.38457: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.38471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.38485: variable 'omit' from source: magic vars 12081 1726882413.38863: variable 'ansible_distribution_major_version' from source: facts 12081 1726882413.38887: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882413.39058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882413.39345: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882413.39514: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882413.39559: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882413.39600: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882413.39700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882413.39729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882413.39770: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882413.39801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882413.40170: variable '__network_is_ostree' from source: set_fact 12081 1726882413.40182: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882413.40189: when evaluation is False, skipping this task 12081 1726882413.40198: _execute() done 12081 1726882413.40209: dumping result to json 12081 1726882413.40218: done dumping result, returning 12081 1726882413.40229: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-0a3f-ff3c-0000000007cc] 12081 1726882413.40239: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cc skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882413.40549: no more pending results, returning what we have 12081 1726882413.40554: results queue empty 12081 1726882413.40555: checking for any_errors_fatal 12081 1726882413.40593: done checking for any_errors_fatal 12081 1726882413.40594: checking for max_fail_percentage 12081 1726882413.40596: done checking for max_fail_percentage 12081 1726882413.40597: checking to see if all hosts have failed and the running result is not ok 12081 1726882413.40598: done checking to see if all hosts have failed 12081 1726882413.40680: getting the remaining hosts for this loop 12081 1726882413.40694: done getting the remaining hosts for this loop 12081 1726882413.40726: getting the next task for host managed_node3 12081 1726882413.40837: done getting next task for host managed_node3 12081 
1726882413.40935: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882413.40989: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882413.41104: getting variables 12081 1726882413.41107: in VariableManager get_vars() 12081 1726882413.41149: Calling all_inventory to load vars for managed_node3 12081 1726882413.41152: Calling groups_inventory to load vars for managed_node3 12081 1726882413.41155: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882413.41169: Calling all_plugins_play to load vars for managed_node3 12081 1726882413.41172: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882413.41175: Calling groups_plugins_play to load vars for managed_node3 12081 1726882413.42074: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cc 12081 1726882413.42078: WORKER PROCESS EXITING 12081 1726882413.43901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882413.48021: done with get_vars() 12081 1726882413.48057: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:33 -0400 (0:00:00.108) 0:00:33.284 ****** 12081 1726882413.48184: entering _queue_task() for managed_node3/service_facts 12081 1726882413.48572: worker is 1 (out of 1 available) 12081 1726882413.48590: exiting _queue_task() for managed_node3/service_facts 12081 1726882413.48605: done queuing things up, now waiting for results queue to drain 12081 1726882413.48606: waiting for pending results... 
12081 1726882413.48952: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882413.49151: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000007ce 12081 1726882413.49169: variable 'ansible_search_path' from source: unknown 12081 1726882413.49175: variable 'ansible_search_path' from source: unknown 12081 1726882413.49233: calling self._execute() 12081 1726882413.49338: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.49342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.49352: variable 'omit' from source: magic vars 12081 1726882413.49809: variable 'ansible_distribution_major_version' from source: facts 12081 1726882413.49867: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882413.49872: variable 'omit' from source: magic vars 12081 1726882413.49947: variable 'omit' from source: magic vars 12081 1726882413.49987: variable 'omit' from source: magic vars 12081 1726882413.50051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882413.50093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882413.50112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882413.50129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882413.50140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882413.50176: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882413.50179: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.50182: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882413.50288: Set connection var ansible_pipelining to False 12081 1726882413.50293: Set connection var ansible_shell_type to sh 12081 1726882413.50298: Set connection var ansible_shell_executable to /bin/sh 12081 1726882413.50301: Set connection var ansible_connection to ssh 12081 1726882413.50306: Set connection var ansible_timeout to 10 12081 1726882413.50311: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882413.50335: variable 'ansible_shell_executable' from source: unknown 12081 1726882413.50339: variable 'ansible_connection' from source: unknown 12081 1726882413.50342: variable 'ansible_module_compression' from source: unknown 12081 1726882413.50344: variable 'ansible_shell_type' from source: unknown 12081 1726882413.50346: variable 'ansible_shell_executable' from source: unknown 12081 1726882413.50349: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882413.50351: variable 'ansible_pipelining' from source: unknown 12081 1726882413.50358: variable 'ansible_timeout' from source: unknown 12081 1726882413.50360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882413.50570: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882413.50579: variable 'omit' from source: magic vars 12081 1726882413.50584: starting attempt loop 12081 1726882413.50587: running the handler 12081 1726882413.50606: _low_level_execute_command(): starting 12081 1726882413.50615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882413.51484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882413.51488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882413.51491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.51494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.51496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.51499: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882413.51501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.51630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882413.51634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882413.51636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882413.51639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.51641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.51643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.51655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.51662: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882413.51666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.51668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882413.51675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882413.51678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882413.51826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882413.53513: stdout chunk (state=3): >>>/root <<< 12081 1726882413.53689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882413.53693: stdout chunk (state=3): >>><<< 12081 1726882413.53695: stderr chunk (state=3): >>><<< 12081 1726882413.53716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882413.53730: _low_level_execute_command(): starting 12081 1726882413.53738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429 `" && echo ansible-tmp-1726882413.537164-13630-160158222342429="` echo /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429 `" ) && sleep 0' 12081 1726882413.55356: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12081 1726882413.55369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.55380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.55395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.55437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.55570: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882413.55682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.55696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882413.55703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882413.55710: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882413.55718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.55728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.55739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.55746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.55753: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882413.55768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.55854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882413.55857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882413.55903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12081 1726882413.56136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882413.58017: stdout chunk (state=3): >>>ansible-tmp-1726882413.537164-13630-160158222342429=/root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429 <<< 12081 1726882413.58195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882413.58199: stdout chunk (state=3): >>><<< 12081 1726882413.58207: stderr chunk (state=3): >>><<< 12081 1726882413.58226: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882413.537164-13630-160158222342429=/root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882413.58280: variable 'ansible_module_compression' from source: unknown 12081 1726882413.58328: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12081 1726882413.58370: variable 'ansible_facts' from source: unknown 12081 1726882413.58441: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/AnsiballZ_service_facts.py 12081 1726882413.59437: Sending initial data 12081 1726882413.59440: Sent initial data (161 bytes) 12081 1726882413.62142: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882413.62277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.62287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.62301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.62410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.62418: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882413.62428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.62442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882413.62449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882413.62460: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882413.62470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.62481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.62497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.62504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
<<< 12081 1726882413.62511: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882413.62521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.62595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882413.62726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882413.62733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882413.62942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882413.64702: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882413.64800: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882413.64900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpwo91yjn2 /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/AnsiballZ_service_facts.py <<< 12081 1726882413.64995: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882413.66424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882413.66517: stderr chunk (state=3): >>><<< 12081 1726882413.66520: stdout chunk (state=3): >>><<< 12081 1726882413.66543: done 
transferring module to remote 12081 1726882413.66553: _low_level_execute_command(): starting 12081 1726882413.66560: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/ /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/AnsiballZ_service_facts.py && sleep 0' 12081 1726882413.68206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882413.68214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.68224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.68238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.68289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.68295: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882413.68304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.68316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882413.68323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882413.68329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882413.68336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.68343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.68353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.68365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.68372: stderr chunk (state=3): >>>debug2: 
match found <<< 12081 1726882413.68382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.68460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882413.68585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882413.68595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882413.68834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882413.70487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882413.70565: stderr chunk (state=3): >>><<< 12081 1726882413.70569: stdout chunk (state=3): >>><<< 12081 1726882413.70589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882413.70592: 
_low_level_execute_command(): starting 12081 1726882413.70598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/AnsiballZ_service_facts.py && sleep 0' 12081 1726882413.72300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882413.72309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.72325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.72339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.72440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.72447: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882413.72460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.72476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882413.72483: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882413.72490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882413.72498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882413.72507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882413.72519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882413.72526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882413.72534: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882413.72548: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882413.72711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882413.72725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882413.72731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882413.72982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.03786: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": 
{"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source"<<< 12081 1726882415.03816: stdout chunk (state=3): >>>: "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": 
"systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "syst<<< 12081 1726882415.03821: stdout chunk (state=3): >>>emd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": 
"dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": 
"qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"<<< 12081 1726882415.03843: stdout chunk (state=3): >>>name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12081 1726882415.05174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882415.05177: stdout chunk (state=3): >>><<< 12081 1726882415.05184: stderr chunk (state=3): >>><<< 12081 1726882415.05247: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": 
"systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882415.06247: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882415.06408: _low_level_execute_command(): starting 12081 1726882415.06428: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882413.537164-13630-160158222342429/ > /dev/null 2>&1 && sleep 0' 12081 1726882415.10310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882415.10320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.10337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.10349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.10390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.10397: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882415.10406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.10421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882415.10427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
<<< 12081 1726882415.10444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882415.10452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.10469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.10478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.10485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.10492: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882415.10501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.10583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.10598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882415.10602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.10732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.12634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.12638: stdout chunk (state=3): >>><<< 12081 1726882415.12640: stderr chunk (state=3): >>><<< 12081 1726882415.12672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882415.12675: handler run complete 12081 1726882415.12866: variable 'ansible_facts' from source: unknown 12081 1726882415.13037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882415.13723: variable 'ansible_facts' from source: unknown 12081 1726882415.13866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882415.14089: attempt loop complete, returning result 12081 1726882415.14101: _execute() done 12081 1726882415.14104: dumping result to json 12081 1726882415.14184: done dumping result, returning 12081 1726882415.14194: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-0a3f-ff3c-0000000007ce] 12081 1726882415.14200: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007ce ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882415.15655: no more pending results, returning what we have 12081 1726882415.15658: results queue empty 12081 1726882415.15659: checking for any_errors_fatal 12081 1726882415.15666: done checking for any_errors_fatal 12081 1726882415.15666: checking for max_fail_percentage 12081 1726882415.15668: 
done checking for max_fail_percentage 12081 1726882415.15669: checking to see if all hosts have failed and the running result is not ok 12081 1726882415.15670: done checking to see if all hosts have failed 12081 1726882415.15670: getting the remaining hosts for this loop 12081 1726882415.15672: done getting the remaining hosts for this loop 12081 1726882415.15677: getting the next task for host managed_node3 12081 1726882415.15683: done getting next task for host managed_node3 12081 1726882415.15686: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882415.15691: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12081 1726882415.15703: getting variables 12081 1726882415.15705: in VariableManager get_vars() 12081 1726882415.15750: Calling all_inventory to load vars for managed_node3 12081 1726882415.15755: Calling groups_inventory to load vars for managed_node3 12081 1726882415.15757: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882415.15771: Calling all_plugins_play to load vars for managed_node3 12081 1726882415.15773: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882415.15779: Calling groups_plugins_play to load vars for managed_node3 12081 1726882415.16302: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007ce 12081 1726882415.16305: WORKER PROCESS EXITING 12081 1726882415.17248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882415.19191: done with get_vars() 12081 1726882415.19216: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:35 -0400 (0:00:01.711) 0:00:34.996 ****** 12081 1726882415.19323: entering _queue_task() for managed_node3/package_facts 12081 1726882415.19683: worker is 1 (out of 1 available) 12081 1726882415.19696: exiting _queue_task() for managed_node3/package_facts 12081 1726882415.19709: done queuing things up, now waiting for results queue to drain 12081 1726882415.19711: waiting for pending results... 
12081 1726882415.20427: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882415.20607: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000007cf 12081 1726882415.20628: variable 'ansible_search_path' from source: unknown 12081 1726882415.20636: variable 'ansible_search_path' from source: unknown 12081 1726882415.20685: calling self._execute() 12081 1726882415.20791: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882415.20802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882415.20823: variable 'omit' from source: magic vars 12081 1726882415.21202: variable 'ansible_distribution_major_version' from source: facts 12081 1726882415.21221: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882415.21234: variable 'omit' from source: magic vars 12081 1726882415.21341: variable 'omit' from source: magic vars 12081 1726882415.21386: variable 'omit' from source: magic vars 12081 1726882415.21432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882415.21479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882415.21503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882415.21525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882415.21540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882415.21577: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882415.21589: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882415.21596: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882415.21702: Set connection var ansible_pipelining to False 12081 1726882415.21710: Set connection var ansible_shell_type to sh 12081 1726882415.21721: Set connection var ansible_shell_executable to /bin/sh 12081 1726882415.21726: Set connection var ansible_connection to ssh 12081 1726882415.21736: Set connection var ansible_timeout to 10 12081 1726882415.21744: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882415.21849: variable 'ansible_shell_executable' from source: unknown 12081 1726882415.21896: variable 'ansible_connection' from source: unknown 12081 1726882415.21904: variable 'ansible_module_compression' from source: unknown 12081 1726882415.21914: variable 'ansible_shell_type' from source: unknown 12081 1726882415.21953: variable 'ansible_shell_executable' from source: unknown 12081 1726882415.22012: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882415.22020: variable 'ansible_pipelining' from source: unknown 12081 1726882415.22031: variable 'ansible_timeout' from source: unknown 12081 1726882415.22038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882415.22304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882415.22320: variable 'omit' from source: magic vars 12081 1726882415.22329: starting attempt loop 12081 1726882415.22335: running the handler 12081 1726882415.22359: _low_level_execute_command(): starting 12081 1726882415.22376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882415.23498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12081 1726882415.23503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.23552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.23556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.23558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882415.23562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.23629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.23633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882415.23635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.23756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.25419: stdout chunk (state=3): >>>/root <<< 12081 1726882415.26306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.26309: stdout chunk (state=3): >>><<< 12081 1726882415.26312: stderr chunk (state=3): >>><<< 12081 1726882415.26437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882415.26441: _low_level_execute_command(): starting 12081 1726882415.26444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730 `" && echo ansible-tmp-1726882415.263356-13693-189617076984730="` echo /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730 `" ) && sleep 0' 12081 1726882415.27218: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882415.27236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.27251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.27270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.27319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.27331: stderr chunk 
(state=3): >>>debug2: match not found <<< 12081 1726882415.27347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.27369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882415.27383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882415.27396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882415.27411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.27424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.27438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.27454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.27467: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882415.27480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.27560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.27579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882415.27593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.27741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.29641: stdout chunk (state=3): >>>ansible-tmp-1726882415.263356-13693-189617076984730=/root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730 <<< 12081 1726882415.29782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.29850: stderr chunk (state=3): >>><<< 12081 1726882415.29854: stdout chunk (state=3): >>><<< 12081 1726882415.29975: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882415.263356-13693-189617076984730=/root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882415.29978: variable 'ansible_module_compression' from source: unknown 12081 1726882415.30084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12081 1726882415.30087: variable 'ansible_facts' from source: unknown 12081 1726882415.30313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/AnsiballZ_package_facts.py 12081 1726882415.31094: Sending initial data 12081 1726882415.31097: Sent initial data (161 bytes) 12081 1726882415.32058: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882415.32061: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.32069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.32104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.32108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.32111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.32184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.32187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882415.32191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.32302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.34050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882415.34147: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882415.34249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpaymiaf55 /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/AnsiballZ_package_facts.py <<< 12081 1726882415.34345: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882415.37151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.37242: stderr chunk (state=3): >>><<< 12081 1726882415.37245: stdout chunk (state=3): >>><<< 12081 1726882415.37270: done transferring module to remote 12081 1726882415.37284: _low_level_execute_command(): starting 12081 1726882415.37288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/ /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/AnsiballZ_package_facts.py && sleep 0' 12081 1726882415.38316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882415.38319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.38322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.38324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.38326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.38328: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882415.38337: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.38341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882415.38343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882415.38345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882415.38348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.38349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.38356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.38358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882415.38360: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882415.38362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.38366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.38368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882415.38370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.38420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.40247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.40251: stdout chunk (state=3): >>><<< 12081 1726882415.40256: stderr chunk (state=3): >>><<< 12081 1726882415.40347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882415.40351: _low_level_execute_command(): starting 12081 1726882415.40357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/AnsiballZ_package_facts.py && sleep 0' 12081 1726882415.40863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.40871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.40907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.40913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 
1726882415.40941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.40944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882415.40946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.40996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.40999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.41111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.87100: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": 
"publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", 
"release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null,<<< 12081 1726882415.87126: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": 
"10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", 
"version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", 
"version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 12081 1726882415.87153: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": <<< 12081 1726882415.87157: stdout chunk (state=3): >>>"7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 12081 1726882415.87210: stdout chunk (state=3): >>>libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", 
"version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": 
"0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 12081 1726882415.87214: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", <<< 12081 
1726882415.87224: stdout chunk (state=3): >>>"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": 
[{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-bas<<< 12081 1726882415.87227: stdout chunk (state=3): >>>e-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": 
"92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": 
"461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": 
"perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": 
"4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": 
[{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": 
"11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12081 1726882415.88861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
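The module result above follows the standard `package_facts` shape: `ansible_facts.packages` maps each package name to a *list* of installed instances (one package can appear more than once, as the two `gpg-pubkey` entries show), each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal sketch of querying that shape in plain Python, using values copied from the log (the `versions` helper is illustrative, not part of Ansible):

```python
# Sketch of the ansible_facts.packages structure returned by package_facts.
# Sample data is copied from the log above; keys with JSON null become None.
packages = {
    "kernel": [{"name": "kernel", "version": "5.14.0",
                "release": "508.el9", "epoch": None,
                "arch": "x86_64", "source": "rpm"}],
    # A name can map to several instances (here: two imported GPG keys).
    "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c",
                    "release": "613798eb", "epoch": None,
                    "arch": None, "source": "rpm"},
                   {"name": "gpg-pubkey", "version": "8483c65d",
                    "release": "5ccc5b19", "epoch": None,
                    "arch": None, "source": "rpm"}],
}

def versions(pkgs, name):
    """Return every installed version-release string for a package name."""
    return [f"{p['version']}-{p['release']}" for p in pkgs.get(name, [])]

print(versions(packages, "kernel"))         # ['5.14.0-508.el9']
print(len(packages.get("gpg-pubkey", [])))  # 2
```

In a playbook the same lookup is typically written as `ansible_facts.packages['kernel'][0].version` after a `package_facts:` task; the list indexing is the part people most often miss.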
<<< 12081 1726882415.88891: stderr chunk (state=3): >>><<< 12081 1726882415.88894: stdout chunk (state=3): >>><<< 12081 1726882415.88945: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882415.91068: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882415.91087: _low_level_execute_command(): starting 12081 1726882415.91091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882415.263356-13693-189617076984730/ > /dev/null 2>&1 && sleep 0' 12081 1726882415.91804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882415.91807: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.91827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882415.91931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.91935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882415.91974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882415.91977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882415.92071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882415.92097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882415.92202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882415.94036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882415.94142: stderr chunk (state=3): >>><<< 12081 1726882415.94148: stdout chunk (state=3): >>><<< 12081 1726882415.94163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882415.94173: handler run complete 12081 1726882415.95080: variable 'ansible_facts' from source: unknown 12081 1726882415.95500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882415.96904: variable 'ansible_facts' from source: unknown 12081 1726882415.97184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882415.97633: attempt loop complete, returning result 12081 1726882415.97643: _execute() done 12081 1726882415.97647: dumping result to json 12081 1726882415.97779: done dumping result, returning 12081 1726882415.97787: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-0a3f-ff3c-0000000007cf] 12081 1726882415.97794: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cf ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 
1726882415.99267: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000007cf 12081 1726882415.99273: WORKER PROCESS EXITING 12081 1726882415.99280: no more pending results, returning what we have 12081 1726882415.99282: results queue empty 12081 1726882415.99283: checking for any_errors_fatal 12081 1726882415.99286: done checking for any_errors_fatal 12081 1726882415.99286: checking for max_fail_percentage 12081 1726882415.99287: done checking for max_fail_percentage 12081 1726882415.99288: checking to see if all hosts have failed and the running result is not ok 12081 1726882415.99289: done checking to see if all hosts have failed 12081 1726882415.99289: getting the remaining hosts for this loop 12081 1726882415.99290: done getting the remaining hosts for this loop 12081 1726882415.99293: getting the next task for host managed_node3 12081 1726882415.99298: done getting next task for host managed_node3 12081 1726882415.99301: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882415.99305: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882415.99311: getting variables 12081 1726882415.99312: in VariableManager get_vars() 12081 1726882415.99336: Calling all_inventory to load vars for managed_node3 12081 1726882415.99338: Calling groups_inventory to load vars for managed_node3 12081 1726882415.99339: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882415.99346: Calling all_plugins_play to load vars for managed_node3 12081 1726882415.99348: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882415.99350: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.00228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.01577: done with get_vars() 12081 1726882416.01614: done getting variables 12081 1726882416.01666: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:36 -0400 (0:00:00.823) 0:00:35.819 ****** 12081 1726882416.01693: entering _queue_task() for managed_node3/debug 12081 1726882416.02024: worker is 1 (out of 1 available) 12081 1726882416.02038: exiting _queue_task() for managed_node3/debug 12081 1726882416.02054: done queuing things up, now waiting for results queue to drain 12081 1726882416.02057: waiting for pending results... 
12081 1726882416.02331: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882416.02440: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000694 12081 1726882416.02449: variable 'ansible_search_path' from source: unknown 12081 1726882416.02455: variable 'ansible_search_path' from source: unknown 12081 1726882416.02494: calling self._execute() 12081 1726882416.02567: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.02571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.02579: variable 'omit' from source: magic vars 12081 1726882416.02894: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.02911: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.02917: variable 'omit' from source: magic vars 12081 1726882416.03030: variable 'omit' from source: magic vars 12081 1726882416.03120: variable 'network_provider' from source: set_fact 12081 1726882416.03145: variable 'omit' from source: magic vars 12081 1726882416.03193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882416.03229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882416.03242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882416.03257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882416.03268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882416.03294: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882416.03297: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882416.03301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.03378: Set connection var ansible_pipelining to False 12081 1726882416.03382: Set connection var ansible_shell_type to sh 12081 1726882416.03387: Set connection var ansible_shell_executable to /bin/sh 12081 1726882416.03389: Set connection var ansible_connection to ssh 12081 1726882416.03399: Set connection var ansible_timeout to 10 12081 1726882416.03404: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882416.03426: variable 'ansible_shell_executable' from source: unknown 12081 1726882416.03432: variable 'ansible_connection' from source: unknown 12081 1726882416.03434: variable 'ansible_module_compression' from source: unknown 12081 1726882416.03438: variable 'ansible_shell_type' from source: unknown 12081 1726882416.03491: variable 'ansible_shell_executable' from source: unknown 12081 1726882416.03494: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.03497: variable 'ansible_pipelining' from source: unknown 12081 1726882416.03499: variable 'ansible_timeout' from source: unknown 12081 1726882416.03503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.03699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882416.03708: variable 'omit' from source: magic vars 12081 1726882416.03712: starting attempt loop 12081 1726882416.03716: running the handler 12081 1726882416.03776: handler run complete 12081 1726882416.03779: attempt loop complete, returning result 12081 1726882416.03783: _execute() done 12081 1726882416.03786: dumping result to json 12081 1726882416.03788: done dumping result, returning 
12081 1726882416.03791: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-0a3f-ff3c-000000000694] 12081 1726882416.03799: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000694 12081 1726882416.03933: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000694 12081 1726882416.03938: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 12081 1726882416.04043: no more pending results, returning what we have 12081 1726882416.04046: results queue empty 12081 1726882416.04047: checking for any_errors_fatal 12081 1726882416.04058: done checking for any_errors_fatal 12081 1726882416.04058: checking for max_fail_percentage 12081 1726882416.04060: done checking for max_fail_percentage 12081 1726882416.04061: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.04062: done checking to see if all hosts have failed 12081 1726882416.04062: getting the remaining hosts for this loop 12081 1726882416.04065: done getting the remaining hosts for this loop 12081 1726882416.04068: getting the next task for host managed_node3 12081 1726882416.04074: done getting next task for host managed_node3 12081 1726882416.04078: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882416.04087: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.04101: getting variables 12081 1726882416.04103: in VariableManager get_vars() 12081 1726882416.04138: Calling all_inventory to load vars for managed_node3 12081 1726882416.04140: Calling groups_inventory to load vars for managed_node3 12081 1726882416.04142: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.04149: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.04150: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.04156: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.05110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.06537: done with get_vars() 12081 1726882416.06571: done getting variables 12081 1726882416.06635: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:36 -0400 (0:00:00.049) 0:00:35.869 ****** 12081 1726882416.06685: entering _queue_task() for managed_node3/fail 12081 1726882416.07088: worker is 1 (out of 1 available) 12081 1726882416.07101: exiting _queue_task() for managed_node3/fail 12081 1726882416.07115: done queuing things up, now waiting for results queue to drain 12081 1726882416.07116: waiting for pending results... 12081 1726882416.07318: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882416.07427: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000695 12081 1726882416.07439: variable 'ansible_search_path' from source: unknown 12081 1726882416.07442: variable 'ansible_search_path' from source: unknown 12081 1726882416.07477: calling self._execute() 12081 1726882416.07543: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.07547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.07593: variable 'omit' from source: magic vars 12081 1726882416.07923: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.07932: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.08020: variable 'network_state' from source: role '' defaults 12081 1726882416.08030: Evaluated conditional (network_state != {}): False 12081 1726882416.08033: when evaluation is False, skipping this task 12081 1726882416.08038: _execute() done 12081 1726882416.08040: dumping result to json 12081 1726882416.08042: done dumping result, returning 12081 1726882416.08045: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-0a3f-ff3c-000000000695] 12081 1726882416.08053: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000695 12081 1726882416.08172: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000695 12081 1726882416.08174: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882416.08221: no more pending results, returning what we have 12081 1726882416.08225: results queue empty 12081 1726882416.08226: checking for any_errors_fatal 12081 1726882416.08232: done checking for any_errors_fatal 12081 1726882416.08237: checking for max_fail_percentage 12081 1726882416.08239: done checking for max_fail_percentage 12081 1726882416.08240: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.08241: done checking to see if all hosts have failed 12081 1726882416.08241: getting the remaining hosts for this loop 12081 1726882416.08243: done getting the remaining hosts for this loop 12081 1726882416.08250: getting the next task for host managed_node3 12081 1726882416.08258: done getting next task for host managed_node3 12081 1726882416.08262: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882416.08268: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.08288: getting variables 12081 1726882416.08290: in VariableManager get_vars() 12081 1726882416.08320: Calling all_inventory to load vars for managed_node3 12081 1726882416.08323: Calling groups_inventory to load vars for managed_node3 12081 1726882416.08325: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.08333: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.08336: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.08338: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.09341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.12444: done with get_vars() 12081 1726882416.12539: done getting variables 12081 1726882416.12655: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:36 -0400 (0:00:00.060) 0:00:35.929 ****** 12081 1726882416.12705: entering _queue_task() for managed_node3/fail 12081 1726882416.13107: worker is 1 (out of 1 available) 12081 1726882416.13119: exiting _queue_task() for managed_node3/fail 12081 1726882416.13139: done queuing things up, now waiting for results queue to drain 12081 1726882416.13141: waiting for pending results... 12081 1726882416.13871: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882416.14244: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000696 12081 1726882416.14324: variable 'ansible_search_path' from source: unknown 12081 1726882416.14388: variable 'ansible_search_path' from source: unknown 12081 1726882416.14534: calling self._execute() 12081 1726882416.14851: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.14866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.14882: variable 'omit' from source: magic vars 12081 1726882416.15489: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.15508: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.16047: variable 'network_state' from source: role '' defaults 12081 1726882416.16066: Evaluated conditional (network_state != {}): False 12081 1726882416.16074: when evaluation is False, skipping this task 12081 1726882416.16081: _execute() done 12081 1726882416.16087: dumping result to json 12081 1726882416.16097: done dumping result, returning 12081 1726882416.16163: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the 
managed host is below 8 [0e448fcc-3ce9-0a3f-ff3c-000000000696] 12081 1726882416.16208: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000696 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882416.16456: no more pending results, returning what we have 12081 1726882416.16461: results queue empty 12081 1726882416.16462: checking for any_errors_fatal 12081 1726882416.16474: done checking for any_errors_fatal 12081 1726882416.16475: checking for max_fail_percentage 12081 1726882416.16477: done checking for max_fail_percentage 12081 1726882416.16478: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.16480: done checking to see if all hosts have failed 12081 1726882416.16480: getting the remaining hosts for this loop 12081 1726882416.16482: done getting the remaining hosts for this loop 12081 1726882416.16487: getting the next task for host managed_node3 12081 1726882416.16495: done getting next task for host managed_node3 12081 1726882416.16500: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882416.16509: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.16531: getting variables 12081 1726882416.16533: in VariableManager get_vars() 12081 1726882416.16578: Calling all_inventory to load vars for managed_node3 12081 1726882416.16582: Calling groups_inventory to load vars for managed_node3 12081 1726882416.16584: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.16598: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.16601: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.16604: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.17680: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000696 12081 1726882416.17684: WORKER PROCESS EXITING 12081 1726882416.22770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.23917: done with get_vars() 12081 1726882416.23945: done getting variables 12081 1726882416.23996: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:36 -0400 (0:00:00.113) 0:00:36.043 ****** 12081 1726882416.24028: entering _queue_task() for managed_node3/fail 12081 1726882416.24392: worker is 1 (out of 1 available) 12081 1726882416.24406: exiting _queue_task() for managed_node3/fail 12081 1726882416.24423: done queuing things up, now waiting for results queue to drain 12081 1726882416.24430: waiting for pending results... 12081 1726882416.24836: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882416.24996: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000697 12081 1726882416.25012: variable 'ansible_search_path' from source: unknown 12081 1726882416.25016: variable 'ansible_search_path' from source: unknown 12081 1726882416.25053: calling self._execute() 12081 1726882416.25152: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.25167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.25175: variable 'omit' from source: magic vars 12081 1726882416.25571: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.25584: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.25775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.29979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.30066: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.30105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882416.30140: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.30172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.30255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.30293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.30318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.30366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.30380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.30509: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.30526: Evaluated conditional (ansible_distribution_major_version | int > 9): False 12081 1726882416.30529: when evaluation is False, skipping this task 12081 1726882416.30532: _execute() done 12081 1726882416.30535: dumping result to json 12081 1726882416.30537: done dumping result, returning 12081 1726882416.30545: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-0a3f-ff3c-000000000697] 12081 1726882416.30552: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000697 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 12081 1726882416.30722: no more pending results, returning what we have 12081 1726882416.30727: results queue empty 12081 1726882416.30727: checking for any_errors_fatal 12081 1726882416.30740: done checking for any_errors_fatal 12081 1726882416.30741: checking for max_fail_percentage 12081 1726882416.30743: done checking for max_fail_percentage 12081 1726882416.30744: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.30746: done checking to see if all hosts have failed 12081 1726882416.30747: getting the remaining hosts for this loop 12081 1726882416.30749: done getting the remaining hosts for this loop 12081 1726882416.30756: getting the next task for host managed_node3 12081 1726882416.30767: done getting next task for host managed_node3 12081 1726882416.30772: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882416.30778: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.30798: getting variables 12081 1726882416.30800: in VariableManager get_vars() 12081 1726882416.30842: Calling all_inventory to load vars for managed_node3 12081 1726882416.30846: Calling groups_inventory to load vars for managed_node3 12081 1726882416.30849: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.30867: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.30871: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.30874: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.31995: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000697 12081 1726882416.31999: WORKER PROCESS EXITING 12081 1726882416.32694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.35041: done with get_vars() 12081 1726882416.35079: done getting variables 12081 1726882416.35140: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:36 -0400 (0:00:00.111) 0:00:36.154 ****** 12081 1726882416.35181: entering _queue_task() for managed_node3/dnf 12081 1726882416.35518: worker is 1 (out of 1 available) 12081 1726882416.35529: exiting _queue_task() for managed_node3/dnf 12081 1726882416.35541: done queuing things up, now waiting for results queue to drain 12081 1726882416.35542: waiting for pending results... 12081 1726882416.35983: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882416.36174: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000698 12081 1726882416.36194: variable 'ansible_search_path' from source: unknown 12081 1726882416.36201: variable 'ansible_search_path' from source: unknown 12081 1726882416.36247: calling self._execute() 12081 1726882416.36355: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.36370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.36384: variable 'omit' from source: magic vars 12081 1726882416.37226: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.37247: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.37592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.41317: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.41395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.41447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 
1726882416.41495: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.41527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.41615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.41649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.41690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.41735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.41759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.41893: variable 'ansible_distribution' from source: facts 12081 1726882416.41906: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.41926: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12081 1726882416.42059: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882416.42201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12081 1726882416.42234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.42271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.42316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.42461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.42508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.42537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.42694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.42739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.42762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.42813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.42898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.43015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.43066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.43087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.43448: variable 'network_connections' from source: task vars 12081 1726882416.43495: variable 'port2_profile' from source: play vars 12081 1726882416.43643: variable 'port2_profile' from source: play vars 12081 1726882416.43682: variable 'port1_profile' from source: play vars 12081 1726882416.43794: variable 'port1_profile' from source: play vars 12081 1726882416.43868: variable 'controller_profile' from source: play vars 12081 1726882416.44003: variable 'controller_profile' from source: play vars 12081 1726882416.44113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882416.44326: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882416.44380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882416.44432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882416.44477: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882416.44530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882416.44571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882416.44608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.44645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882416.44709: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882416.44983: variable 'network_connections' from source: task vars 12081 1726882416.44993: variable 'port2_profile' from source: play vars 12081 1726882416.45071: variable 'port2_profile' from source: play vars 12081 1726882416.45085: variable 'port1_profile' from source: play vars 12081 1726882416.45156: variable 'port1_profile' from source: play vars 12081 1726882416.45172: variable 'controller_profile' from source: play vars 12081 1726882416.45240: variable 'controller_profile' from source: play vars 12081 1726882416.45277: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882416.45288: when evaluation is False, skipping this task 12081 1726882416.45296: _execute() done 12081 1726882416.45303: dumping result to json 12081 1726882416.45311: done dumping result, returning 12081 1726882416.45322: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000698] 12081 1726882416.45334: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000698 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882416.45520: no more pending results, returning what we have 12081 1726882416.45525: results queue empty 12081 1726882416.45526: checking for any_errors_fatal 12081 1726882416.45533: done checking for any_errors_fatal 12081 1726882416.45534: checking for max_fail_percentage 12081 1726882416.45536: done checking for max_fail_percentage 12081 1726882416.45537: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.45539: done checking to see if all hosts have failed 12081 1726882416.45539: getting the remaining hosts for this loop 12081 1726882416.45541: done getting the remaining hosts for this loop 12081 1726882416.45545: getting the next task for host managed_node3 12081 1726882416.45558: done getting next task for host managed_node3 12081 1726882416.45565: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882416.45571: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882416.45590: getting variables 12081 1726882416.45593: in VariableManager get_vars() 12081 1726882416.45633: Calling all_inventory to load vars for managed_node3 12081 1726882416.45636: Calling groups_inventory to load vars for managed_node3 12081 1726882416.45638: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.45650: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.45655: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.45658: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.47325: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000698 12081 1726882416.47331: WORKER PROCESS EXITING 12081 1726882416.48222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.49655: done with get_vars() 12081 1726882416.49676: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882416.49733: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:36 -0400 (0:00:00.145) 0:00:36.300 ****** 12081 1726882416.49761: entering _queue_task() for managed_node3/yum 12081 1726882416.49999: worker is 1 (out of 1 available) 12081 1726882416.50012: exiting _queue_task() for managed_node3/yum 12081 1726882416.50024: done queuing things up, now 
waiting for results queue to drain 12081 1726882416.50026: waiting for pending results... 12081 1726882416.50215: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882416.50332: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000699 12081 1726882416.50341: variable 'ansible_search_path' from source: unknown 12081 1726882416.50344: variable 'ansible_search_path' from source: unknown 12081 1726882416.50378: calling self._execute() 12081 1726882416.50445: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.50450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.50457: variable 'omit' from source: magic vars 12081 1726882416.50740: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.50751: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.50889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.54484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.54557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.54590: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882416.54628: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.54656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.54732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.54763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.54787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.54827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.54842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.54939: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.54956: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12081 1726882416.54960: when evaluation is False, skipping this task 12081 1726882416.54965: _execute() done 12081 1726882416.54967: dumping result to json 12081 1726882416.54969: done dumping result, returning 12081 1726882416.54980: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000699] 12081 1726882416.54985: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000699 12081 1726882416.55082: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000699 12081 1726882416.55085: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12081 1726882416.55139: no more pending results, returning what we have 12081 1726882416.55143: results queue empty 12081 1726882416.55144: checking for any_errors_fatal 12081 1726882416.55151: done checking for any_errors_fatal 12081 1726882416.55154: checking for max_fail_percentage 12081 1726882416.55156: done checking for max_fail_percentage 12081 1726882416.55156: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.55157: done checking to see if all hosts have failed 12081 1726882416.55158: getting the remaining hosts for this loop 12081 1726882416.55160: done getting the remaining hosts for this loop 12081 1726882416.55165: getting the next task for host managed_node3 12081 1726882416.55174: done getting next task for host managed_node3 12081 1726882416.55179: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882416.55184: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.55202: getting variables 12081 1726882416.55205: in VariableManager get_vars() 12081 1726882416.55240: Calling all_inventory to load vars for managed_node3 12081 1726882416.55242: Calling groups_inventory to load vars for managed_node3 12081 1726882416.55244: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.55256: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.55259: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.55261: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.56814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.59723: done with get_vars() 12081 1726882416.59776: done getting variables 12081 1726882416.59843: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:36 -0400 (0:00:00.101) 0:00:36.401 ****** 12081 1726882416.59899: entering _queue_task() for managed_node3/fail 12081 1726882416.60284: worker is 1 (out of 1 available) 12081 1726882416.60300: exiting _queue_task() for managed_node3/fail 12081 1726882416.60311: done queuing things up, now waiting for results queue to drain 12081 1726882416.60313: waiting for pending 
results... 12081 1726882416.60643: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882416.60838: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069a 12081 1726882416.60861: variable 'ansible_search_path' from source: unknown 12081 1726882416.60875: variable 'ansible_search_path' from source: unknown 12081 1726882416.60920: calling self._execute() 12081 1726882416.61039: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.61062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.61080: variable 'omit' from source: magic vars 12081 1726882416.61512: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.61535: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.61695: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882416.61947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.64737: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.64836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.64896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882416.64937: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.64983: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.65074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12081 1726882416.65111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.65143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.65200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.65219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.65271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.65310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.65343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.65395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.65422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.65474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.65506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.65539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.65590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.65614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.65823: variable 'network_connections' from source: task vars 12081 1726882416.65842: variable 'port2_profile' from source: play vars 12081 1726882416.65919: variable 'port2_profile' from source: play vars 12081 1726882416.65938: variable 'port1_profile' from source: play vars 12081 1726882416.66016: variable 'port1_profile' from source: play vars 12081 1726882416.66029: variable 'controller_profile' from source: play vars 12081 1726882416.66108: variable 'controller_profile' from source: play vars 12081 1726882416.66197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882416.66409: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882416.66456: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882416.66505: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882416.66542: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882416.66601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882416.66635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882416.66672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.66716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882416.66782: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882416.67081: variable 'network_connections' from source: task vars 12081 1726882416.67092: variable 'port2_profile' from source: play vars 12081 1726882416.67176: variable 'port2_profile' from source: play vars 12081 1726882416.67193: variable 'port1_profile' from source: play vars 12081 1726882416.67267: variable 'port1_profile' from source: play vars 12081 1726882416.67282: variable 'controller_profile' from source: play vars 12081 1726882416.67351: variable 'controller_profile' from source: play vars 12081 1726882416.67390: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882416.67409: when evaluation is False, skipping this task 12081 1726882416.67415: _execute() done 12081 1726882416.67422: dumping result to json 12081 1726882416.67428: done dumping result, returning 12081 1726882416.67440: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000069a] 12081 1726882416.67458: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069a skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882416.67633: no more pending results, returning what we have 12081 1726882416.67637: results queue empty 12081 1726882416.67638: checking for any_errors_fatal 12081 1726882416.67644: done checking for any_errors_fatal 12081 1726882416.67645: checking for max_fail_percentage 12081 1726882416.67647: done checking for max_fail_percentage 12081 1726882416.67648: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.67650: done checking to see if all hosts have failed 12081 1726882416.67650: getting the remaining hosts for this loop 12081 1726882416.67655: done getting the remaining hosts for this loop 12081 1726882416.67660: getting the next task for host managed_node3 12081 1726882416.67671: done getting next task for host managed_node3 12081 1726882416.67676: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12081 1726882416.67681: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.67699: getting variables 12081 1726882416.67701: in VariableManager get_vars() 12081 1726882416.67741: Calling all_inventory to load vars for managed_node3 12081 1726882416.67744: Calling groups_inventory to load vars for managed_node3 12081 1726882416.67747: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.67762: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.67767: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.67771: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.68424: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069a 12081 1726882416.68427: WORKER PROCESS EXITING 12081 1726882416.68825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.69836: done with get_vars() 12081 1726882416.69855: done getting variables 12081 1726882416.69899: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:36 -0400 (0:00:00.100) 0:00:36.502 ****** 12081 1726882416.69927: entering _queue_task() for managed_node3/package 12081 1726882416.70162: worker is 1 (out of 1 available) 12081 1726882416.70175: exiting _queue_task() for managed_node3/package 12081 1726882416.70188: done queuing things up, now waiting for results queue to drain 12081 1726882416.70190: waiting for pending results... 12081 1726882416.70376: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12081 1726882416.70480: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069b 12081 1726882416.70496: variable 'ansible_search_path' from source: unknown 12081 1726882416.70501: variable 'ansible_search_path' from source: unknown 12081 1726882416.70525: calling self._execute() 12081 1726882416.70630: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.70633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.70644: variable 'omit' from source: magic vars 12081 1726882416.70973: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.70983: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.71594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882416.71598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882416.71601: Loading 
TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882416.71604: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882416.71606: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882416.71930: variable 'network_packages' from source: role '' defaults 12081 1726882416.71934: variable '__network_provider_setup' from source: role '' defaults 12081 1726882416.71939: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882416.71942: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882416.71944: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882416.71946: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882416.72075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.74086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.74131: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.74159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882416.74184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.74203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.74531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.74556: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.74576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.74602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.74613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.74643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.74668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.74685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.74710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.74720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 
1726882416.74870: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882416.74940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.74959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.74982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.75007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.75017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.75083: variable 'ansible_python' from source: facts 12081 1726882416.75101: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882416.75153: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882416.75213: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882416.75295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.75315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.75334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.75361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.75373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.75404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.75427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.75445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.75474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.75484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.75581: variable 'network_connections' from source: task vars 
12081 1726882416.75585: variable 'port2_profile' from source: play vars 12081 1726882416.75659: variable 'port2_profile' from source: play vars 12081 1726882416.75669: variable 'port1_profile' from source: play vars 12081 1726882416.75734: variable 'port1_profile' from source: play vars 12081 1726882416.75747: variable 'controller_profile' from source: play vars 12081 1726882416.75816: variable 'controller_profile' from source: play vars 12081 1726882416.75870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882416.75890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882416.75910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.75930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882416.75977: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882416.76184: variable 'network_connections' from source: task vars 12081 1726882416.76193: variable 'port2_profile' from source: play vars 12081 1726882416.76266: variable 'port2_profile' from source: play vars 12081 1726882416.76275: variable 'port1_profile' from source: play vars 12081 1726882416.76462: variable 'port1_profile' from source: play vars 12081 1726882416.76467: variable 'controller_profile' from source: play vars 12081 1726882416.76470: variable 'controller_profile' from source: play vars 12081 1726882416.76505: variable '__network_packages_default_wireless' 
from source: role '' defaults 12081 1726882416.76579: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882416.76879: variable 'network_connections' from source: task vars 12081 1726882416.76882: variable 'port2_profile' from source: play vars 12081 1726882416.76943: variable 'port2_profile' from source: play vars 12081 1726882416.76950: variable 'port1_profile' from source: play vars 12081 1726882416.77013: variable 'port1_profile' from source: play vars 12081 1726882416.77020: variable 'controller_profile' from source: play vars 12081 1726882416.77088: variable 'controller_profile' from source: play vars 12081 1726882416.77110: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882416.77190: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882416.77495: variable 'network_connections' from source: task vars 12081 1726882416.77499: variable 'port2_profile' from source: play vars 12081 1726882416.77572: variable 'port2_profile' from source: play vars 12081 1726882416.77576: variable 'port1_profile' from source: play vars 12081 1726882416.77631: variable 'port1_profile' from source: play vars 12081 1726882416.77638: variable 'controller_profile' from source: play vars 12081 1726882416.77713: variable 'controller_profile' from source: play vars 12081 1726882416.77747: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882416.77795: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882416.77801: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882416.77842: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882416.77982: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882416.78293: variable 'network_connections' from source: task vars 12081 
1726882416.78297: variable 'port2_profile' from source: play vars 12081 1726882416.78342: variable 'port2_profile' from source: play vars 12081 1726882416.78349: variable 'port1_profile' from source: play vars 12081 1726882416.78394: variable 'port1_profile' from source: play vars 12081 1726882416.78400: variable 'controller_profile' from source: play vars 12081 1726882416.78444: variable 'controller_profile' from source: play vars 12081 1726882416.78451: variable 'ansible_distribution' from source: facts 12081 1726882416.78457: variable '__network_rh_distros' from source: role '' defaults 12081 1726882416.78462: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.78476: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882416.78587: variable 'ansible_distribution' from source: facts 12081 1726882416.78590: variable '__network_rh_distros' from source: role '' defaults 12081 1726882416.78594: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.78605: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882416.78716: variable 'ansible_distribution' from source: facts 12081 1726882416.78720: variable '__network_rh_distros' from source: role '' defaults 12081 1726882416.78724: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.78750: variable 'network_provider' from source: set_fact 12081 1726882416.78768: variable 'ansible_facts' from source: unknown 12081 1726882416.79161: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12081 1726882416.79166: when evaluation is False, skipping this task 12081 1726882416.79168: _execute() done 12081 1726882416.79171: dumping result to json 12081 1726882416.79173: done dumping result, returning 12081 1726882416.79184: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-0a3f-ff3c-00000000069b] 12081 1726882416.79188: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069b 12081 1726882416.79278: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069b 12081 1726882416.79281: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12081 1726882416.79334: no more pending results, returning what we have 12081 1726882416.79337: results queue empty 12081 1726882416.79338: checking for any_errors_fatal 12081 1726882416.79347: done checking for any_errors_fatal 12081 1726882416.79347: checking for max_fail_percentage 12081 1726882416.79349: done checking for max_fail_percentage 12081 1726882416.79350: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.79351: done checking to see if all hosts have failed 12081 1726882416.79351: getting the remaining hosts for this loop 12081 1726882416.79353: done getting the remaining hosts for this loop 12081 1726882416.79369: getting the next task for host managed_node3 12081 1726882416.79378: done getting next task for host managed_node3 12081 1726882416.79382: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882416.79387: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.79404: getting variables 12081 1726882416.79406: in VariableManager get_vars() 12081 1726882416.79441: Calling all_inventory to load vars for managed_node3 12081 1726882416.79444: Calling groups_inventory to load vars for managed_node3 12081 1726882416.79445: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.79455: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.79458: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.79461: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.80486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.81968: done with get_vars() 12081 1726882416.81984: done getting variables 12081 1726882416.82027: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:36 -0400 (0:00:00.121) 0:00:36.623 ****** 12081 1726882416.82055: entering _queue_task() for managed_node3/package 12081 1726882416.82402: worker is 1 (out of 1 available) 12081 1726882416.82415: exiting _queue_task() for managed_node3/package 12081 1726882416.82428: done queuing things up, now waiting for results queue to drain 12081 1726882416.82430: waiting for pending results... 12081 1726882416.82772: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882416.82913: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069c 12081 1726882416.82938: variable 'ansible_search_path' from source: unknown 12081 1726882416.82942: variable 'ansible_search_path' from source: unknown 12081 1726882416.82982: calling self._execute() 12081 1726882416.83090: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.83093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.83115: variable 'omit' from source: magic vars 12081 1726882416.83544: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.83572: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.83674: variable 'network_state' from source: role '' defaults 12081 1726882416.83685: Evaluated conditional (network_state != {}): False 12081 1726882416.83689: when evaluation is False, skipping this task 12081 1726882416.83691: _execute() done 12081 1726882416.83694: dumping result to json 12081 1726882416.83697: done dumping result, returning 12081 1726882416.83701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-00000000069c] 
12081 1726882416.83708: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069c 12081 1726882416.83801: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069c 12081 1726882416.83803: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882416.83989: no more pending results, returning what we have 12081 1726882416.83996: results queue empty 12081 1726882416.83997: checking for any_errors_fatal 12081 1726882416.84006: done checking for any_errors_fatal 12081 1726882416.84007: checking for max_fail_percentage 12081 1726882416.84009: done checking for max_fail_percentage 12081 1726882416.84010: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.84011: done checking to see if all hosts have failed 12081 1726882416.84011: getting the remaining hosts for this loop 12081 1726882416.84015: done getting the remaining hosts for this loop 12081 1726882416.84021: getting the next task for host managed_node3 12081 1726882416.84029: done getting next task for host managed_node3 12081 1726882416.84033: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882416.84038: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.84058: getting variables 12081 1726882416.84060: in VariableManager get_vars() 12081 1726882416.84090: Calling all_inventory to load vars for managed_node3 12081 1726882416.84092: Calling groups_inventory to load vars for managed_node3 12081 1726882416.84093: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.84100: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.84102: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.84103: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.85331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.86434: done with get_vars() 12081 1726882416.86454: done getting variables 12081 1726882416.86496: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:36 -0400 (0:00:00.044) 0:00:36.668 ****** 12081 1726882416.86520: 
entering _queue_task() for managed_node3/package 12081 1726882416.86724: worker is 1 (out of 1 available) 12081 1726882416.86738: exiting _queue_task() for managed_node3/package 12081 1726882416.86750: done queuing things up, now waiting for results queue to drain 12081 1726882416.86754: waiting for pending results... 12081 1726882416.86940: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882416.87044: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069d 12081 1726882416.87058: variable 'ansible_search_path' from source: unknown 12081 1726882416.87061: variable 'ansible_search_path' from source: unknown 12081 1726882416.87092: calling self._execute() 12081 1726882416.87165: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.87169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.87176: variable 'omit' from source: magic vars 12081 1726882416.87437: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.87446: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.87531: variable 'network_state' from source: role '' defaults 12081 1726882416.87542: Evaluated conditional (network_state != {}): False 12081 1726882416.87545: when evaluation is False, skipping this task 12081 1726882416.87548: _execute() done 12081 1726882416.87550: dumping result to json 12081 1726882416.87556: done dumping result, returning 12081 1726882416.87559: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-00000000069d] 12081 1726882416.87562: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069d 12081 1726882416.87656: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069d 12081 1726882416.87658: WORKER 
PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882416.87712: no more pending results, returning what we have 12081 1726882416.87715: results queue empty 12081 1726882416.87716: checking for any_errors_fatal 12081 1726882416.87721: done checking for any_errors_fatal 12081 1726882416.87722: checking for max_fail_percentage 12081 1726882416.87723: done checking for max_fail_percentage 12081 1726882416.87724: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.87725: done checking to see if all hosts have failed 12081 1726882416.87726: getting the remaining hosts for this loop 12081 1726882416.87727: done getting the remaining hosts for this loop 12081 1726882416.87730: getting the next task for host managed_node3 12081 1726882416.87736: done getting next task for host managed_node3 12081 1726882416.87739: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882416.87744: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882416.87762: getting variables 12081 1726882416.87767: in VariableManager get_vars() 12081 1726882416.87803: Calling all_inventory to load vars for managed_node3 12081 1726882416.87805: Calling groups_inventory to load vars for managed_node3 12081 1726882416.87806: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.87813: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.87814: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.87816: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.88768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.89795: done with get_vars() 12081 1726882416.89820: done getting variables 12081 1726882416.89881: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:36 -0400 (0:00:00.034) 0:00:36.702 ****** 12081 1726882416.89932: entering _queue_task() for managed_node3/service 12081 1726882416.90275: worker is 1 (out of 1 available) 12081 1726882416.90295: exiting _queue_task() for managed_node3/service 12081 1726882416.90308: done queuing things up, now waiting for results queue to drain 12081 1726882416.90310: 
waiting for pending results... 12081 1726882416.90602: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882416.90714: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069e 12081 1726882416.90725: variable 'ansible_search_path' from source: unknown 12081 1726882416.90729: variable 'ansible_search_path' from source: unknown 12081 1726882416.90762: calling self._execute() 12081 1726882416.90835: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.90839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.90847: variable 'omit' from source: magic vars 12081 1726882416.91253: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.91283: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.91400: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882416.91602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882416.93563: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882416.93618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882416.93644: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882416.93674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882416.93695: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882416.93754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12081 1726882416.93779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.93796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.93824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.93834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.93870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.93887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.93903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.93931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.93942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.93973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882416.93990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882416.94006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.94030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882416.94041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882416.94179: variable 'network_connections' from source: task vars 12081 1726882416.94190: variable 'port2_profile' from source: play vars 12081 1726882416.94236: variable 'port2_profile' from source: play vars 12081 1726882416.94245: variable 'port1_profile' from source: play vars 12081 1726882416.94296: variable 'port1_profile' from source: play vars 12081 1726882416.94302: variable 'controller_profile' from source: play vars 12081 1726882416.94345: variable 'controller_profile' from source: play vars 12081 1726882416.94401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882416.94523: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882416.94550: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882416.94576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882416.94600: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882416.94630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882416.94646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882416.94669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882416.94688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882416.94729: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882416.94920: variable 'network_connections' from source: task vars 12081 1726882416.94923: variable 'port2_profile' from source: play vars 12081 1726882416.94976: variable 'port2_profile' from source: play vars 12081 1726882416.94982: variable 'port1_profile' from source: play vars 12081 1726882416.95041: variable 'port1_profile' from source: play vars 12081 1726882416.95049: variable 'controller_profile' from source: play vars 12081 1726882416.95109: variable 'controller_profile' from source: play vars 12081 1726882416.95136: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882416.95146: when evaluation is False, skipping this task 12081 1726882416.95149: _execute() done 12081 1726882416.95159: dumping result to json 12081 1726882416.95162: done dumping result, returning 12081 1726882416.95172: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000069e] 12081 1726882416.95177: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069e 12081 1726882416.95291: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069e 12081 1726882416.95294: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882416.95336: no more pending results, returning what we have 12081 1726882416.95339: results queue empty 12081 1726882416.95340: checking for any_errors_fatal 12081 1726882416.95351: done checking for any_errors_fatal 12081 1726882416.95352: checking for max_fail_percentage 12081 1726882416.95354: done checking for max_fail_percentage 12081 1726882416.95355: checking to see if all hosts have failed and the running result is not ok 12081 1726882416.95356: done checking to see if all hosts have failed 12081 1726882416.95357: getting the remaining hosts for this loop 12081 1726882416.95359: done getting the remaining hosts for this loop 12081 1726882416.95362: getting the next task for host managed_node3 12081 1726882416.95391: done getting next task for host managed_node3 12081 1726882416.95396: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882416.95401: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882416.95440: getting variables 12081 1726882416.95442: in VariableManager get_vars() 12081 1726882416.95486: Calling all_inventory to load vars for managed_node3 12081 1726882416.95492: Calling groups_inventory to load vars for managed_node3 12081 1726882416.95494: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882416.95503: Calling all_plugins_play to load vars for managed_node3 12081 1726882416.95505: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882416.95508: Calling groups_plugins_play to load vars for managed_node3 12081 1726882416.96386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882416.97519: done with get_vars() 12081 1726882416.97541: done getting variables 12081 1726882416.97589: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:36 -0400 (0:00:00.076) 0:00:36.779 ****** 12081 1726882416.97613: entering _queue_task() for managed_node3/service 12081 1726882416.97833: worker is 1 (out of 1 available) 12081 1726882416.97845: exiting _queue_task() for managed_node3/service 12081 1726882416.97860: done queuing things up, now waiting for results queue to drain 12081 1726882416.97863: waiting for pending results... 
12081 1726882416.98072: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882416.98208: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000069f 12081 1726882416.98212: variable 'ansible_search_path' from source: unknown 12081 1726882416.98215: variable 'ansible_search_path' from source: unknown 12081 1726882416.98234: calling self._execute() 12081 1726882416.98328: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882416.98331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882416.98340: variable 'omit' from source: magic vars 12081 1726882416.98694: variable 'ansible_distribution_major_version' from source: facts 12081 1726882416.98704: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882416.98831: variable 'network_provider' from source: set_fact 12081 1726882416.98835: variable 'network_state' from source: role '' defaults 12081 1726882416.98849: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12081 1726882416.98867: variable 'omit' from source: magic vars 12081 1726882416.98909: variable 'omit' from source: magic vars 12081 1726882416.98939: variable 'network_service_name' from source: role '' defaults 12081 1726882416.98997: variable 'network_service_name' from source: role '' defaults 12081 1726882416.99091: variable '__network_provider_setup' from source: role '' defaults 12081 1726882416.99095: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882416.99175: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882416.99178: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882416.99225: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882416.99385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12081 1726882417.01596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882417.01646: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882417.01676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882417.01702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882417.01721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882417.01860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.01880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.01920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.01969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.01990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.02081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12081 1726882417.02113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.02181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.02234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.02278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.02467: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882417.02628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.02648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.02686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.02719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.02728: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.02794: variable 'ansible_python' from source: facts 12081 1726882417.02804: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882417.02865: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882417.02919: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882417.03015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.03030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.03166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.03169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.03172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.03174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.03183: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.03191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.03230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.03246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.03378: variable 'network_connections' from source: task vars 12081 1726882417.03386: variable 'port2_profile' from source: play vars 12081 1726882417.03453: variable 'port2_profile' from source: play vars 12081 1726882417.03474: variable 'port1_profile' from source: play vars 12081 1726882417.03539: variable 'port1_profile' from source: play vars 12081 1726882417.03550: variable 'controller_profile' from source: play vars 12081 1726882417.03620: variable 'controller_profile' from source: play vars 12081 1726882417.03722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882417.03932: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882417.03982: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882417.04020: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882417.04060: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882417.04118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882417.04147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882417.04181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.04215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882417.04271: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882417.04538: variable 'network_connections' from source: task vars 12081 1726882417.04544: variable 'port2_profile' from source: play vars 12081 1726882417.04617: variable 'port2_profile' from source: play vars 12081 1726882417.04628: variable 'port1_profile' from source: play vars 12081 1726882417.04701: variable 'port1_profile' from source: play vars 12081 1726882417.04712: variable 'controller_profile' from source: play vars 12081 1726882417.04784: variable 'controller_profile' from source: play vars 12081 1726882417.04815: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882417.04899: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882417.05198: variable 'network_connections' from source: task vars 12081 1726882417.05201: variable 'port2_profile' from source: play vars 12081 1726882417.05270: variable 'port2_profile' from source: play vars 12081 
1726882417.05278: variable 'port1_profile' from source: play vars 12081 1726882417.05351: variable 'port1_profile' from source: play vars 12081 1726882417.05354: variable 'controller_profile' from source: play vars 12081 1726882417.05418: variable 'controller_profile' from source: play vars 12081 1726882417.05436: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882417.05514: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882417.05697: variable 'network_connections' from source: task vars 12081 1726882417.05700: variable 'port2_profile' from source: play vars 12081 1726882417.05749: variable 'port2_profile' from source: play vars 12081 1726882417.05757: variable 'port1_profile' from source: play vars 12081 1726882417.05809: variable 'port1_profile' from source: play vars 12081 1726882417.05814: variable 'controller_profile' from source: play vars 12081 1726882417.05867: variable 'controller_profile' from source: play vars 12081 1726882417.05909: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882417.05954: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882417.05962: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882417.06025: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882417.06242: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882417.06559: variable 'network_connections' from source: task vars 12081 1726882417.06564: variable 'port2_profile' from source: play vars 12081 1726882417.06610: variable 'port2_profile' from source: play vars 12081 1726882417.06616: variable 'port1_profile' from source: play vars 12081 1726882417.06659: variable 'port1_profile' from source: play vars 12081 1726882417.06666: variable 'controller_profile' from source: play vars 12081 
1726882417.06711: variable 'controller_profile' from source: play vars 12081 1726882417.06718: variable 'ansible_distribution' from source: facts 12081 1726882417.06720: variable '__network_rh_distros' from source: role '' defaults 12081 1726882417.06726: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.06737: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882417.06857: variable 'ansible_distribution' from source: facts 12081 1726882417.06860: variable '__network_rh_distros' from source: role '' defaults 12081 1726882417.06865: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.06876: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882417.06988: variable 'ansible_distribution' from source: facts 12081 1726882417.06992: variable '__network_rh_distros' from source: role '' defaults 12081 1726882417.06996: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.07025: variable 'network_provider' from source: set_fact 12081 1726882417.07042: variable 'omit' from source: magic vars 12081 1726882417.07066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882417.07087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882417.07101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882417.07113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882417.07123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882417.07147: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882417.07150: 
variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.07153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.07222: Set connection var ansible_pipelining to False 12081 1726882417.07225: Set connection var ansible_shell_type to sh 12081 1726882417.07229: Set connection var ansible_shell_executable to /bin/sh 12081 1726882417.07232: Set connection var ansible_connection to ssh 12081 1726882417.07237: Set connection var ansible_timeout to 10 12081 1726882417.07243: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882417.07268: variable 'ansible_shell_executable' from source: unknown 12081 1726882417.07271: variable 'ansible_connection' from source: unknown 12081 1726882417.07273: variable 'ansible_module_compression' from source: unknown 12081 1726882417.07275: variable 'ansible_shell_type' from source: unknown 12081 1726882417.07277: variable 'ansible_shell_executable' from source: unknown 12081 1726882417.07279: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.07282: variable 'ansible_pipelining' from source: unknown 12081 1726882417.07285: variable 'ansible_timeout' from source: unknown 12081 1726882417.07289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.07362: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882417.07372: variable 'omit' from source: magic vars 12081 1726882417.07377: starting attempt loop 12081 1726882417.07379: running the handler 12081 1726882417.07431: variable 'ansible_facts' from source: unknown 12081 1726882417.08025: _low_level_execute_command(): starting 12081 1726882417.08036: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882417.10155: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882417.10176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.10198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.10218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.10261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.10277: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882417.10291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.10316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882417.10327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882417.10338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882417.10351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.10368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.10384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.10397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.10410: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882417.10429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.10504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.10533: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.10550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.10692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.12374: stdout chunk (state=3): >>>/root <<< 12081 1726882417.12479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882417.12578: stderr chunk (state=3): >>><<< 12081 1726882417.12581: stdout chunk (state=3): >>><<< 12081 1726882417.12970: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882417.12974: _low_level_execute_command(): starting 12081 1726882417.12978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837 `" && echo ansible-tmp-1726882417.128875-13769-275985885340837="` echo /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837 `" ) && sleep 0' 12081 1726882417.14697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882417.14712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.14734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.14760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.14809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.14822: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882417.14841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.14869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882417.14883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882417.14895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882417.14907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.14919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.14935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.14950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.14969: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882417.14988: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.15069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.15101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.15120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.15255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.17136: stdout chunk (state=3): >>>ansible-tmp-1726882417.128875-13769-275985885340837=/root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837 <<< 12081 1726882417.17343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882417.17346: stdout chunk (state=3): >>><<< 12081 1726882417.17349: stderr chunk (state=3): >>><<< 12081 1726882417.17376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882417.128875-13769-275985885340837=/root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882417.17573: variable 'ansible_module_compression' from source: unknown 12081 1726882417.17576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12081 1726882417.17578: variable 'ansible_facts' from source: unknown 12081 1726882417.17733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/AnsiballZ_systemd.py 12081 1726882417.18923: Sending initial data 12081 1726882417.18926: Sent initial data (155 bytes) 12081 1726882417.20721: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882417.20749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.20769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.20789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.20833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.20966: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882417.20988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.21006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882417.21019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882417.21030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882417.21043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.21064: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.21087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.21100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.21110: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882417.21124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.21287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.21315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.21332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.21473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.23222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882417.23317: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882417.23419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpnirvfk6b /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/AnsiballZ_systemd.py <<< 12081 1726882417.23513: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882417.26634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882417.26760: stderr chunk (state=3): >>><<< 12081 1726882417.26765: stdout chunk (state=3): >>><<< 12081 1726882417.26768: done transferring module to remote 12081 1726882417.26770: _low_level_execute_command(): starting 12081 1726882417.26773: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/ /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/AnsiballZ_systemd.py && sleep 0' 12081 1726882417.28403: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.28407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.28454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.28494: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.28560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.28569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.28573: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.28704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.30470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882417.30504: stderr chunk (state=3): >>><<< 12081 1726882417.30507: stdout chunk (state=3): >>><<< 12081 1726882417.30524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882417.30532: _low_level_execute_command(): starting 12081 1726882417.30535: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/AnsiballZ_systemd.py && sleep 0' 12081 1726882417.32107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882417.32116: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.32126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.32139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.32179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.32186: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882417.32199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.32208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882417.32215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882417.32221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882417.32229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.32238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.32250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.32260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882417.32271: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882417.32280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.32350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.32373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.32385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.32523: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.57391: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13938688", "MemoryAvailable": "infinity", "CPUUsageNSec": "761112000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": 
"8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0"<<< 12081 1726882417.57396: stdout chunk (state=3): >>>, "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12081 1726882417.58874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882417.58878: stdout chunk (state=3): >>><<< 12081 1726882417.58883: stderr chunk (state=3): >>><<< 12081 1726882417.58900: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13938688", "MemoryAvailable": "infinity", "CPUUsageNSec": "761112000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882417.59084: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882417.59101: _low_level_execute_command(): starting 12081 1726882417.59107: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882417.128875-13769-275985885340837/ > /dev/null 2>&1 && sleep 0' 12081 1726882417.60107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882417.60113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882417.60165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.60171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882417.60184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882417.60190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882417.60269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882417.60282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882417.60288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882417.60411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882417.62294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882417.62297: stdout chunk (state=3): >>><<< 12081 1726882417.62300: stderr chunk (state=3): >>><<< 12081 1726882417.62576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882417.62579: handler run complete 12081 1726882417.62582: attempt loop complete, returning result 12081 1726882417.62584: _execute() done 12081 1726882417.62586: dumping result to json 12081 1726882417.62588: done dumping result, returning 12081 1726882417.62590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-0a3f-ff3c-00000000069f] 12081 1726882417.62592: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069f 12081 1726882417.63402: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000069f 12081 1726882417.63405: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882417.63490: no more pending results, returning what we have 12081 1726882417.63493: results queue empty 12081 1726882417.63494: checking for any_errors_fatal 12081 1726882417.63528: done checking for any_errors_fatal 12081 1726882417.63530: checking for max_fail_percentage 12081 1726882417.63532: done checking for max_fail_percentage 12081 1726882417.63533: checking to see if all hosts have failed and the running result is not ok 12081 1726882417.63534: done checking to see if all hosts have failed 12081 1726882417.63534: getting the remaining hosts for this loop 12081 1726882417.63536: done getting the remaining hosts for this loop 12081 1726882417.63539: getting the next task for host managed_node3 12081 1726882417.63545: done getting next task for host managed_node3 12081 1726882417.63549: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 
1726882417.63555: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882417.63569: getting variables 12081 1726882417.63570: in VariableManager get_vars() 12081 1726882417.63598: Calling all_inventory to load vars for managed_node3 12081 1726882417.63601: Calling groups_inventory to load vars for managed_node3 12081 1726882417.63603: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882417.63612: Calling all_plugins_play to load vars for managed_node3 12081 1726882417.63615: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882417.63618: Calling groups_plugins_play to load vars for managed_node3 12081 1726882417.64981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882417.66801: done with get_vars() 12081 1726882417.66828: done getting variables 12081 1726882417.66899: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:37 -0400 (0:00:00.693) 0:00:37.472 ****** 12081 1726882417.66936: entering _queue_task() for managed_node3/service 12081 1726882417.67271: worker is 1 (out of 1 available) 12081 1726882417.67284: exiting _queue_task() for managed_node3/service 12081 1726882417.67297: done queuing things up, now waiting for results queue to drain 12081 1726882417.67298: waiting for pending results... 
12081 1726882417.67604: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882417.67765: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a0 12081 1726882417.67787: variable 'ansible_search_path' from source: unknown 12081 1726882417.67791: variable 'ansible_search_path' from source: unknown 12081 1726882417.67828: calling self._execute() 12081 1726882417.67930: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.67934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.67943: variable 'omit' from source: magic vars 12081 1726882417.68332: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.68344: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882417.68468: variable 'network_provider' from source: set_fact 12081 1726882417.68475: Evaluated conditional (network_provider == "nm"): True 12081 1726882417.68573: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882417.68672: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882417.68862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882417.71190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882417.71266: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882417.71301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882417.71347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882417.71375: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882417.71623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.71662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.71690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.71729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.71748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.71802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.71824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.71849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.71903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.71917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.71958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.71992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.72017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.72058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.72072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.72244: variable 'network_connections' from source: task vars 12081 1726882417.72257: variable 'port2_profile' from source: play vars 12081 1726882417.72336: variable 'port2_profile' from source: play vars 12081 1726882417.72347: variable 'port1_profile' from source: play vars 12081 1726882417.72413: variable 'port1_profile' from source: play vars 12081 1726882417.72430: variable 'controller_profile' from source: play vars 12081 1726882417.72489: variable 'controller_profile' from source: play vars 12081 
1726882417.72568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882417.72736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882417.72779: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882417.72807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882417.72832: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882417.72884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882417.72902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882417.72924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.72951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882417.73008: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882417.73290: variable 'network_connections' from source: task vars 12081 1726882417.73298: variable 'port2_profile' from source: play vars 12081 1726882417.73358: variable 'port2_profile' from source: play vars 12081 1726882417.73365: variable 'port1_profile' from source: play vars 12081 1726882417.73428: variable 'port1_profile' from source: play vars 12081 1726882417.73440: variable 
'controller_profile' from source: play vars 12081 1726882417.73496: variable 'controller_profile' from source: play vars 12081 1726882417.73534: Evaluated conditional (__network_wpa_supplicant_required): False 12081 1726882417.73538: when evaluation is False, skipping this task 12081 1726882417.73541: _execute() done 12081 1726882417.73544: dumping result to json 12081 1726882417.73546: done dumping result, returning 12081 1726882417.73555: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-0a3f-ff3c-0000000006a0] 12081 1726882417.73559: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a0 12081 1726882417.73660: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a0 12081 1726882417.73665: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12081 1726882417.73720: no more pending results, returning what we have 12081 1726882417.73725: results queue empty 12081 1726882417.73726: checking for any_errors_fatal 12081 1726882417.73748: done checking for any_errors_fatal 12081 1726882417.73749: checking for max_fail_percentage 12081 1726882417.73751: done checking for max_fail_percentage 12081 1726882417.73752: checking to see if all hosts have failed and the running result is not ok 12081 1726882417.73753: done checking to see if all hosts have failed 12081 1726882417.73754: getting the remaining hosts for this loop 12081 1726882417.73756: done getting the remaining hosts for this loop 12081 1726882417.73761: getting the next task for host managed_node3 12081 1726882417.73771: done getting next task for host managed_node3 12081 1726882417.73776: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882417.73782: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882417.73800: getting variables 12081 1726882417.73802: in VariableManager get_vars() 12081 1726882417.73841: Calling all_inventory to load vars for managed_node3 12081 1726882417.73844: Calling groups_inventory to load vars for managed_node3 12081 1726882417.73847: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882417.73859: Calling all_plugins_play to load vars for managed_node3 12081 1726882417.73863: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882417.73868: Calling groups_plugins_play to load vars for managed_node3 12081 1726882417.75732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882417.77578: done with get_vars() 12081 1726882417.77610: done getting variables 12081 1726882417.77685: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:37 -0400 (0:00:00.107) 0:00:37.580 ****** 12081 1726882417.77721: entering _queue_task() for managed_node3/service 12081 1726882417.78086: worker is 1 (out of 1 available) 12081 1726882417.78105: exiting _queue_task() for managed_node3/service 12081 1726882417.78119: done queuing things up, now waiting for results queue to drain 12081 1726882417.78120: waiting for pending results... 
12081 1726882417.78424: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882417.78576: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a1 12081 1726882417.78589: variable 'ansible_search_path' from source: unknown 12081 1726882417.78593: variable 'ansible_search_path' from source: unknown 12081 1726882417.78629: calling self._execute() 12081 1726882417.78736: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.78739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.78756: variable 'omit' from source: magic vars 12081 1726882417.79148: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.79160: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882417.79295: variable 'network_provider' from source: set_fact 12081 1726882417.79306: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882417.79309: when evaluation is False, skipping this task 12081 1726882417.79312: _execute() done 12081 1726882417.79314: dumping result to json 12081 1726882417.79321: done dumping result, returning 12081 1726882417.79329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-0a3f-ff3c-0000000006a1] 12081 1726882417.79337: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a1 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882417.79487: no more pending results, returning what we have 12081 1726882417.79491: results queue empty 12081 1726882417.79492: checking for any_errors_fatal 12081 1726882417.79505: done checking for any_errors_fatal 12081 1726882417.79506: checking for max_fail_percentage 12081 1726882417.79508: done checking for max_fail_percentage 12081 
1726882417.79509: checking to see if all hosts have failed and the running result is not ok 12081 1726882417.79510: done checking to see if all hosts have failed 12081 1726882417.79512: getting the remaining hosts for this loop 12081 1726882417.79514: done getting the remaining hosts for this loop 12081 1726882417.79518: getting the next task for host managed_node3 12081 1726882417.79528: done getting next task for host managed_node3 12081 1726882417.79533: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882417.79539: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882417.79570: getting variables 12081 1726882417.79573: in VariableManager get_vars() 12081 1726882417.79613: Calling all_inventory to load vars for managed_node3 12081 1726882417.79616: Calling groups_inventory to load vars for managed_node3 12081 1726882417.79619: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882417.79634: Calling all_plugins_play to load vars for managed_node3 12081 1726882417.79638: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882417.79642: Calling groups_plugins_play to load vars for managed_node3 12081 1726882417.80471: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a1 12081 1726882417.80475: WORKER PROCESS EXITING 12081 1726882417.81556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882417.83298: done with get_vars() 12081 1726882417.83330: done getting variables 12081 1726882417.83397: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:37 -0400 (0:00:00.057) 0:00:37.637 ****** 12081 1726882417.83436: entering _queue_task() for managed_node3/copy 12081 1726882417.83786: worker is 1 (out of 1 available) 12081 1726882417.83798: exiting _queue_task() for managed_node3/copy 12081 1726882417.83810: done queuing things up, now waiting for results queue to drain 12081 1726882417.83811: waiting for pending results... 
12081 1726882417.84106: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882417.84279: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a2 12081 1726882417.84295: variable 'ansible_search_path' from source: unknown 12081 1726882417.84301: variable 'ansible_search_path' from source: unknown 12081 1726882417.84340: calling self._execute() 12081 1726882417.84445: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.84459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.84479: variable 'omit' from source: magic vars 12081 1726882417.84861: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.84881: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882417.85005: variable 'network_provider' from source: set_fact 12081 1726882417.85021: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882417.85029: when evaluation is False, skipping this task 12081 1726882417.85035: _execute() done 12081 1726882417.85043: dumping result to json 12081 1726882417.85050: done dumping result, returning 12081 1726882417.85070: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-0a3f-ff3c-0000000006a2] 12081 1726882417.85084: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a2 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12081 1726882417.85247: no more pending results, returning what we have 12081 1726882417.85251: results queue empty 12081 1726882417.85255: checking for any_errors_fatal 12081 1726882417.85262: done checking for any_errors_fatal 12081 1726882417.85265: checking for max_fail_percentage 12081 
1726882417.85267: done checking for max_fail_percentage 12081 1726882417.85268: checking to see if all hosts have failed and the running result is not ok 12081 1726882417.85270: done checking to see if all hosts have failed 12081 1726882417.85270: getting the remaining hosts for this loop 12081 1726882417.85272: done getting the remaining hosts for this loop 12081 1726882417.85277: getting the next task for host managed_node3 12081 1726882417.85286: done getting next task for host managed_node3 12081 1726882417.85290: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882417.85296: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882417.85316: getting variables 12081 1726882417.85319: in VariableManager get_vars() 12081 1726882417.85361: Calling all_inventory to load vars for managed_node3 12081 1726882417.85366: Calling groups_inventory to load vars for managed_node3 12081 1726882417.85369: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882417.85383: Calling all_plugins_play to load vars for managed_node3 12081 1726882417.85386: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882417.85389: Calling groups_plugins_play to load vars for managed_node3 12081 1726882417.86382: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a2 12081 1726882417.86385: WORKER PROCESS EXITING 12081 1726882417.87242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882417.88957: done with get_vars() 12081 1726882417.88987: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:37 -0400 (0:00:00.056) 0:00:37.693 ****** 12081 1726882417.89084: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882417.89401: worker is 1 (out of 1 available) 12081 1726882417.89413: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882417.89427: done queuing things up, now waiting for results queue to drain 12081 1726882417.89428: waiting for pending results... 
12081 1726882417.89726: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882417.89890: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a3 12081 1726882417.89911: variable 'ansible_search_path' from source: unknown 12081 1726882417.89920: variable 'ansible_search_path' from source: unknown 12081 1726882417.89969: calling self._execute() 12081 1726882417.90074: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882417.90089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882417.90104: variable 'omit' from source: magic vars 12081 1726882417.90488: variable 'ansible_distribution_major_version' from source: facts 12081 1726882417.90507: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882417.90520: variable 'omit' from source: magic vars 12081 1726882417.90600: variable 'omit' from source: magic vars 12081 1726882417.90777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882417.93083: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882417.93165: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882417.93207: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882417.93250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882417.93288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882417.93383: variable 'network_provider' from source: set_fact 12081 1726882417.93525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882417.93567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882417.93599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882417.93647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882417.93677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882417.93758: variable 'omit' from source: magic vars 12081 1726882417.93879: variable 'omit' from source: magic vars 12081 1726882417.93992: variable 'network_connections' from source: task vars 12081 1726882417.94010: variable 'port2_profile' from source: play vars 12081 1726882417.94078: variable 'port2_profile' from source: play vars 12081 1726882417.94093: variable 'port1_profile' from source: play vars 12081 1726882417.94161: variable 'port1_profile' from source: play vars 12081 1726882417.94177: variable 'controller_profile' from source: play vars 12081 1726882417.94241: variable 'controller_profile' from source: play vars 12081 1726882417.94421: variable 'omit' from source: magic vars 12081 1726882417.94440: variable '__lsr_ansible_managed' from source: task vars 12081 1726882417.94508: variable '__lsr_ansible_managed' from source: task vars 12081 1726882417.94718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12081 
1726882417.94935: Loaded config def from plugin (lookup/template) 12081 1726882417.94944: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12081 1726882417.94980: File lookup term: get_ansible_managed.j2 12081 1726882417.94987: variable 'ansible_search_path' from source: unknown 12081 1726882417.94996: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12081 1726882417.95010: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12081 1726882417.95029: variable 'ansible_search_path' from source: unknown 12081 1726882418.01783: variable 'ansible_managed' from source: unknown 12081 1726882418.01945: variable 'omit' from source: magic vars 12081 1726882418.01987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882418.02021: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882418.02046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882418.02075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882418.02091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882418.02123: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882418.02137: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.02146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882418.02251: Set connection var ansible_pipelining to False 12081 1726882418.02263: Set connection var ansible_shell_type to sh 12081 1726882418.02279: Set connection var ansible_shell_executable to /bin/sh 12081 1726882418.02286: Set connection var ansible_connection to ssh 12081 1726882418.02297: Set connection var ansible_timeout to 10 12081 1726882418.02306: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882418.02334: variable 'ansible_shell_executable' from source: unknown 12081 1726882418.02343: variable 'ansible_connection' from source: unknown 12081 1726882418.02358: variable 'ansible_module_compression' from source: unknown 12081 1726882418.02368: variable 'ansible_shell_type' from source: unknown 12081 1726882418.02375: variable 'ansible_shell_executable' from source: unknown 12081 1726882418.02382: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.02389: variable 'ansible_pipelining' from source: unknown 12081 1726882418.02395: variable 'ansible_timeout' from source: unknown 12081 1726882418.02411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882418.02551: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882418.02574: variable 'omit' from source: magic vars 12081 1726882418.02585: starting attempt loop 12081 1726882418.02592: running the handler 12081 1726882418.02611: _low_level_execute_command(): starting 12081 1726882418.02623: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882418.03417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.03436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.03451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.03475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.03520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.03532: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.03546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.03567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882418.03580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.03591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.03603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.03616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.03630: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.03640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.03649: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.03667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.03737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.03756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.03775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.03917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.05608: stdout chunk (state=3): >>>/root <<< 12081 1726882418.05801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882418.05805: stdout chunk (state=3): >>><<< 12081 1726882418.05807: stderr chunk (state=3): >>><<< 12081 1726882418.05918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882418.05921: _low_level_execute_command(): starting 12081 1726882418.05925: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680 `" && echo ansible-tmp-1726882418.0582716-13816-190736274356680="` echo /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680 `" ) && sleep 0' 12081 1726882418.06524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.06536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.06548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.06572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.06620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.06633: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.06647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.06673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882418.06689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.06701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.06714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.06727: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882418.06743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.06757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.06772: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.06789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.06869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.06886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.06904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.07042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.08930: stdout chunk (state=3): >>>ansible-tmp-1726882418.0582716-13816-190736274356680=/root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680 <<< 12081 1726882418.09041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882418.09135: stderr chunk (state=3): >>><<< 12081 1726882418.09140: stdout chunk (state=3): >>><<< 12081 1726882418.09173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882418.0582716-13816-190736274356680=/root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882418.09222: variable 'ansible_module_compression' from source: unknown 12081 1726882418.09279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12081 1726882418.09310: variable 'ansible_facts' from source: unknown 12081 1726882418.09406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/AnsiballZ_network_connections.py 12081 1726882418.09565: Sending initial data 12081 1726882418.09571: Sent initial data (168 bytes) 12081 1726882418.10623: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.10631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.10640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.10656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.10694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.10709: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.10718: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.10730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882418.10737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.10743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.10750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.10759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.10772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.10779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.10785: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.10793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.10872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.10886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.10892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.11022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.12757: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882418.12855: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882418.12958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp1cm9q58_ /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/AnsiballZ_network_connections.py <<< 12081 1726882418.13062: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882418.15020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882418.15149: stderr chunk (state=3): >>><<< 12081 1726882418.15155: stdout chunk (state=3): >>><<< 12081 1726882418.15158: done transferring module to remote 12081 1726882418.15160: _low_level_execute_command(): starting 12081 1726882418.15163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/ /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/AnsiballZ_network_connections.py && sleep 0' 12081 1726882418.15734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.15747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.15766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.15784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.15826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.15836: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.15848: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.15871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882418.15883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.15892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.15902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.15914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.15927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.15936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.15945: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.15960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.16030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.16045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.16060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.16194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.17928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882418.18018: stderr chunk (state=3): >>><<< 12081 1726882418.18021: stdout chunk (state=3): >>><<< 12081 1726882418.18045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882418.18055: _low_level_execute_command(): starting 12081 1726882418.18058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/AnsiballZ_network_connections.py && sleep 0' 12081 1726882418.18747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.18757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.18769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.18786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.18827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.18834: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.18844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.18858: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 12081 1726882418.18867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.18874: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.18882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.18890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.18906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.18914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.18920: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.18930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.19004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.19023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.19034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.19168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.71702: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 12081 1726882418.71722: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 12081 1726882418.71729: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/f1f3e927-6a4a-4c04-ae47-d49ad5d71408: error=unknown <<< 12081 1726882418.73994: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/ac020b9c-b6bb-4116-8ba6-0fb911e93ae5: error=unknown <<< 12081 1726882418.75738: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, 
in fail <<< 12081 1726882418.75743: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/26d4f563-848c-497d-a265-cdc5607d566a: error=unknown <<< 12081 1726882418.76019: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12081 1726882418.77676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882418.77680: stdout chunk (state=3): >>><<< 12081 1726882418.77685: stderr chunk (state=3): >>><<< 12081 1726882418.77710: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/f1f3e927-6a4a-4c04-ae47-d49ad5d71408: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/ac020b9c-b6bb-4116-8ba6-0fb911e93ae5: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ld3f_nhv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/26d4f563-848c-497d-a265-cdc5607d566a: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882418.77758: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882418.77768: _low_level_execute_command(): starting 12081 
1726882418.77771: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882418.0582716-13816-190736274356680/ > /dev/null 2>&1 && sleep 0' 12081 1726882418.78758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882418.78766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.78778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.78792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.78831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.78847: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882418.78857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.78876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882418.78884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882418.78891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882418.78899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882418.78908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882418.78920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882418.78927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882418.78934: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882418.78945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882418.79024: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882418.79039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882418.79048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882418.79258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882418.81371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882418.81374: stdout chunk (state=3): >>><<< 12081 1726882418.81376: stderr chunk (state=3): >>><<< 12081 1726882418.81378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882418.81380: handler run complete 12081 1726882418.81382: attempt loop complete, returning result 12081 1726882418.81384: _execute() done 12081 1726882418.81385: dumping result to json 12081 
1726882418.81387: done dumping result, returning 12081 1726882418.81388: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-0a3f-ff3c-0000000006a3] 12081 1726882418.81390: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a3 12081 1726882418.81473: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a3 12081 1726882418.81477: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12081 1726882418.81602: no more pending results, returning what we have 12081 1726882418.81606: results queue empty 12081 1726882418.81607: checking for any_errors_fatal 12081 1726882418.81614: done checking for any_errors_fatal 12081 1726882418.81615: checking for max_fail_percentage 12081 1726882418.81617: done checking for max_fail_percentage 12081 1726882418.81618: checking to see if all hosts have failed and the running result is not ok 12081 1726882418.81619: done checking to see if all hosts have failed 12081 1726882418.81620: getting the remaining hosts for this loop 12081 1726882418.81622: done getting the remaining hosts for this loop 12081 1726882418.81626: getting the next task for host managed_node3 12081 1726882418.81632: done getting next task for host managed_node3 12081 1726882418.81636: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882418.81641: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882418.81657: getting variables 12081 1726882418.81659: in VariableManager get_vars() 12081 1726882418.81697: Calling all_inventory to load vars for managed_node3 12081 1726882418.81700: Calling groups_inventory to load vars for managed_node3 12081 1726882418.81703: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882418.81714: Calling all_plugins_play to load vars for managed_node3 12081 1726882418.81717: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882418.81720: Calling groups_plugins_play to load vars for managed_node3 12081 1726882418.84876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882418.86805: done with get_vars() 12081 1726882418.86832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:38 -0400 (0:00:00.978) 0:00:38.672 ****** 12081 1726882418.86931: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882418.87281: worker is 1 (out of 1 available) 12081 1726882418.87293: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882418.87307: done queuing things up, now waiting for results queue to drain 12081 1726882418.87309: waiting for pending results... 
12081 1726882418.87619: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882418.87771: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a4 12081 1726882418.87784: variable 'ansible_search_path' from source: unknown 12081 1726882418.87791: variable 'ansible_search_path' from source: unknown 12081 1726882418.87825: calling self._execute() 12081 1726882418.87915: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.87918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882418.87930: variable 'omit' from source: magic vars 12081 1726882418.88322: variable 'ansible_distribution_major_version' from source: facts 12081 1726882418.88339: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882418.88470: variable 'network_state' from source: role '' defaults 12081 1726882418.88479: Evaluated conditional (network_state != {}): False 12081 1726882418.88485: when evaluation is False, skipping this task 12081 1726882418.88488: _execute() done 12081 1726882418.88495: dumping result to json 12081 1726882418.88514: done dumping result, returning 12081 1726882418.88521: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-0a3f-ff3c-0000000006a4] 12081 1726882418.88529: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a4 12081 1726882418.88633: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a4 12081 1726882418.88637: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882418.88695: no more pending results, returning what we have 12081 1726882418.88699: results queue empty 12081 1726882418.88700: checking for any_errors_fatal 12081 1726882418.88715: done checking for any_errors_fatal 
12081 1726882418.88716: checking for max_fail_percentage 12081 1726882418.88718: done checking for max_fail_percentage 12081 1726882418.88719: checking to see if all hosts have failed and the running result is not ok 12081 1726882418.88720: done checking to see if all hosts have failed 12081 1726882418.88721: getting the remaining hosts for this loop 12081 1726882418.88723: done getting the remaining hosts for this loop 12081 1726882418.88727: getting the next task for host managed_node3 12081 1726882418.88735: done getting next task for host managed_node3 12081 1726882418.88739: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882418.88744: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882418.88766: getting variables 12081 1726882418.88772: in VariableManager get_vars() 12081 1726882418.88811: Calling all_inventory to load vars for managed_node3 12081 1726882418.88814: Calling groups_inventory to load vars for managed_node3 12081 1726882418.88817: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882418.88830: Calling all_plugins_play to load vars for managed_node3 12081 1726882418.88833: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882418.88836: Calling groups_plugins_play to load vars for managed_node3 12081 1726882418.91510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882418.94867: done with get_vars() 12081 1726882418.94898: done getting variables 12081 1726882418.94959: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:38 -0400 (0:00:00.080) 0:00:38.752 ****** 12081 1726882418.94998: entering _queue_task() for managed_node3/debug 12081 1726882418.95321: worker is 1 (out of 1 available) 12081 1726882418.95334: exiting _queue_task() for managed_node3/debug 12081 1726882418.95347: done queuing things up, now waiting for results queue to drain 12081 1726882418.95349: waiting for pending results... 
12081 1726882418.95637: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882418.95776: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a5 12081 1726882418.95793: variable 'ansible_search_path' from source: unknown 12081 1726882418.95797: variable 'ansible_search_path' from source: unknown 12081 1726882418.95833: calling self._execute() 12081 1726882418.95923: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.95927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882418.95937: variable 'omit' from source: magic vars 12081 1726882418.96426: variable 'ansible_distribution_major_version' from source: facts 12081 1726882418.96539: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882418.96543: variable 'omit' from source: magic vars 12081 1726882418.96545: variable 'omit' from source: magic vars 12081 1726882418.96548: variable 'omit' from source: magic vars 12081 1726882418.96586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882418.96621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882418.96642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882418.96658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882418.96670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882418.96698: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882418.96701: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.96704: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882418.96916: Set connection var ansible_pipelining to False 12081 1726882418.96919: Set connection var ansible_shell_type to sh 12081 1726882418.96925: Set connection var ansible_shell_executable to /bin/sh 12081 1726882418.96927: Set connection var ansible_connection to ssh 12081 1726882418.96933: Set connection var ansible_timeout to 10 12081 1726882418.96937: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882418.97080: variable 'ansible_shell_executable' from source: unknown 12081 1726882418.97084: variable 'ansible_connection' from source: unknown 12081 1726882418.97087: variable 'ansible_module_compression' from source: unknown 12081 1726882418.97090: variable 'ansible_shell_type' from source: unknown 12081 1726882418.97094: variable 'ansible_shell_executable' from source: unknown 12081 1726882418.97096: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882418.97098: variable 'ansible_pipelining' from source: unknown 12081 1726882418.97100: variable 'ansible_timeout' from source: unknown 12081 1726882418.97102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882418.97356: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882418.97362: variable 'omit' from source: magic vars 12081 1726882418.97369: starting attempt loop 12081 1726882418.97372: running the handler 12081 1726882418.97730: variable '__network_connections_result' from source: set_fact 12081 1726882418.97787: handler run complete 12081 1726882418.97809: attempt loop complete, returning result 12081 1726882418.97929: _execute() done 12081 1726882418.97932: dumping result to json 12081 1726882418.97935: 
done dumping result, returning 12081 1726882418.97945: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-0000000006a5] 12081 1726882418.97955: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a5 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 12081 1726882418.98111: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a5 12081 1726882418.98127: no more pending results, returning what we have 12081 1726882418.98131: results queue empty 12081 1726882418.98132: checking for any_errors_fatal 12081 1726882418.98141: done checking for any_errors_fatal 12081 1726882418.98142: checking for max_fail_percentage 12081 1726882418.98144: done checking for max_fail_percentage 12081 1726882418.98145: checking to see if all hosts have failed and the running result is not ok 12081 1726882418.98146: done checking to see if all hosts have failed 12081 1726882418.98147: getting the remaining hosts for this loop 12081 1726882418.98149: done getting the remaining hosts for this loop 12081 1726882418.98153: getting the next task for host managed_node3 12081 1726882418.98162: done getting next task for host managed_node3 12081 1726882418.98169: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882418.98175: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882418.98189: WORKER PROCESS EXITING 12081 1726882418.98195: getting variables 12081 1726882418.98197: in VariableManager get_vars() 12081 1726882418.98237: Calling all_inventory to load vars for managed_node3 12081 1726882418.98240: Calling groups_inventory to load vars for managed_node3 12081 1726882418.98242: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882418.98254: Calling all_plugins_play to load vars for managed_node3 12081 1726882418.98257: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882418.98260: Calling groups_plugins_play to load vars for managed_node3 12081 1726882419.00171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882419.02866: done with get_vars() 12081 1726882419.02895: done getting variables 12081 1726882419.02948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:39 -0400 (0:00:00.079) 0:00:38.832 ****** 12081 1726882419.02990: entering _queue_task() for managed_node3/debug 12081 1726882419.03837: worker is 1 (out of 1 available) 12081 1726882419.03850: exiting _queue_task() for managed_node3/debug 12081 1726882419.03865: done queuing things up, now waiting for results queue to drain 12081 1726882419.03867: waiting for pending results... 12081 1726882419.04539: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882419.04703: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a6 12081 1726882419.04716: variable 'ansible_search_path' from source: unknown 12081 1726882419.04719: variable 'ansible_search_path' from source: unknown 12081 1726882419.04759: calling self._execute() 12081 1726882419.04855: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.04859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.04867: variable 'omit' from source: magic vars 12081 1726882419.05250: variable 'ansible_distribution_major_version' from source: facts 12081 1726882419.05283: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882419.05289: variable 'omit' from source: magic vars 12081 1726882419.05370: variable 'omit' from source: magic vars 12081 1726882419.05404: variable 'omit' from source: magic vars 12081 1726882419.05455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882419.05488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882419.05508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882419.05525: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.05536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.05572: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882419.05575: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.05578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.05681: Set connection var ansible_pipelining to False 12081 1726882419.05685: Set connection var ansible_shell_type to sh 12081 1726882419.05692: Set connection var ansible_shell_executable to /bin/sh 12081 1726882419.05694: Set connection var ansible_connection to ssh 12081 1726882419.05700: Set connection var ansible_timeout to 10 12081 1726882419.05706: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882419.05731: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.05734: variable 'ansible_connection' from source: unknown 12081 1726882419.05737: variable 'ansible_module_compression' from source: unknown 12081 1726882419.05739: variable 'ansible_shell_type' from source: unknown 12081 1726882419.05742: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.05746: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.05748: variable 'ansible_pipelining' from source: unknown 12081 1726882419.05750: variable 'ansible_timeout' from source: unknown 12081 1726882419.05756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.05902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882419.05912: variable 'omit' from source: magic vars 12081 1726882419.05917: starting attempt loop 12081 1726882419.05920: running the handler 12081 1726882419.05970: variable '__network_connections_result' from source: set_fact 12081 1726882419.06086: variable '__network_connections_result' from source: set_fact 12081 1726882419.06291: handler run complete 12081 1726882419.06348: attempt loop complete, returning result 12081 1726882419.06351: _execute() done 12081 1726882419.06357: dumping result to json 12081 1726882419.06359: done dumping result, returning 12081 1726882419.06368: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-0000000006a6] 12081 1726882419.06376: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a6 12081 1726882419.07060: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a6 12081 1726882419.07068: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12081 1726882419.07745: no more pending results, returning what we have 12081 1726882419.07749: results queue empty 12081 1726882419.07750: checking for any_errors_fatal 12081 1726882419.07765: done checking for any_errors_fatal 12081 1726882419.07766: 
checking for max_fail_percentage 12081 1726882419.07767: done checking for max_fail_percentage 12081 1726882419.07768: checking to see if all hosts have failed and the running result is not ok 12081 1726882419.07769: done checking to see if all hosts have failed 12081 1726882419.07770: getting the remaining hosts for this loop 12081 1726882419.07772: done getting the remaining hosts for this loop 12081 1726882419.07777: getting the next task for host managed_node3 12081 1726882419.07785: done getting next task for host managed_node3 12081 1726882419.07789: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882419.07794: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882419.07806: getting variables 12081 1726882419.07808: in VariableManager get_vars() 12081 1726882419.07846: Calling all_inventory to load vars for managed_node3 12081 1726882419.07857: Calling groups_inventory to load vars for managed_node3 12081 1726882419.07860: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882419.07882: Calling all_plugins_play to load vars for managed_node3 12081 1726882419.07886: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882419.07889: Calling groups_plugins_play to load vars for managed_node3 12081 1726882419.10406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882419.12394: done with get_vars() 12081 1726882419.12428: done getting variables 12081 1726882419.12496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:39 -0400 (0:00:00.095) 0:00:38.928 ****** 12081 1726882419.12562: entering _queue_task() for managed_node3/debug 12081 1726882419.12918: worker is 1 (out of 1 available) 12081 1726882419.12931: exiting _queue_task() for managed_node3/debug 12081 1726882419.12942: done queuing things up, now waiting for results queue to drain 12081 1726882419.12943: waiting for pending results... 
12081 1726882419.13253: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882419.13595: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a7 12081 1726882419.13616: variable 'ansible_search_path' from source: unknown 12081 1726882419.13620: variable 'ansible_search_path' from source: unknown 12081 1726882419.13648: calling self._execute() 12081 1726882419.13743: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.13747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.13765: variable 'omit' from source: magic vars 12081 1726882419.14493: variable 'ansible_distribution_major_version' from source: facts 12081 1726882419.14507: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882419.14744: variable 'network_state' from source: role '' defaults 12081 1726882419.14757: Evaluated conditional (network_state != {}): False 12081 1726882419.14761: when evaluation is False, skipping this task 12081 1726882419.14765: _execute() done 12081 1726882419.14815: dumping result to json 12081 1726882419.14819: done dumping result, returning 12081 1726882419.14828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-0a3f-ff3c-0000000006a7] 12081 1726882419.14836: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a7 12081 1726882419.14936: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a7 12081 1726882419.14940: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12081 1726882419.14990: no more pending results, returning what we have 12081 1726882419.14995: results queue empty 12081 1726882419.14996: checking for any_errors_fatal 12081 1726882419.15011: done checking for any_errors_fatal 12081 1726882419.15012: checking for 
max_fail_percentage 12081 1726882419.15013: done checking for max_fail_percentage 12081 1726882419.15014: checking to see if all hosts have failed and the running result is not ok 12081 1726882419.15015: done checking to see if all hosts have failed 12081 1726882419.15016: getting the remaining hosts for this loop 12081 1726882419.15018: done getting the remaining hosts for this loop 12081 1726882419.15022: getting the next task for host managed_node3 12081 1726882419.15030: done getting next task for host managed_node3 12081 1726882419.15034: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882419.15040: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882419.15063: getting variables 12081 1726882419.15067: in VariableManager get_vars() 12081 1726882419.15104: Calling all_inventory to load vars for managed_node3 12081 1726882419.15106: Calling groups_inventory to load vars for managed_node3 12081 1726882419.15108: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882419.15121: Calling all_plugins_play to load vars for managed_node3 12081 1726882419.15124: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882419.15127: Calling groups_plugins_play to load vars for managed_node3 12081 1726882419.16875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882419.18575: done with get_vars() 12081 1726882419.18598: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:39 -0400 (0:00:00.061) 0:00:38.989 ****** 12081 1726882419.18694: entering _queue_task() for managed_node3/ping 12081 1726882419.19009: worker is 1 (out of 1 available) 12081 1726882419.19022: exiting _queue_task() for managed_node3/ping 12081 1726882419.19034: done queuing things up, now waiting for results queue to drain 12081 1726882419.19035: waiting for pending results... 
12081 1726882419.19337: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882419.19672: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006a8 12081 1726882419.19676: variable 'ansible_search_path' from source: unknown 12081 1726882419.19679: variable 'ansible_search_path' from source: unknown 12081 1726882419.19682: calling self._execute() 12081 1726882419.19685: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.19687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.19689: variable 'omit' from source: magic vars 12081 1726882419.19988: variable 'ansible_distribution_major_version' from source: facts 12081 1726882419.20001: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882419.20007: variable 'omit' from source: magic vars 12081 1726882419.20079: variable 'omit' from source: magic vars 12081 1726882419.20107: variable 'omit' from source: magic vars 12081 1726882419.20149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882419.20182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882419.20199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882419.20215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.20225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.20255: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882419.20261: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.20266: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12081 1726882419.20362: Set connection var ansible_pipelining to False 12081 1726882419.20368: Set connection var ansible_shell_type to sh 12081 1726882419.20375: Set connection var ansible_shell_executable to /bin/sh 12081 1726882419.20377: Set connection var ansible_connection to ssh 12081 1726882419.20383: Set connection var ansible_timeout to 10 12081 1726882419.20388: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882419.20415: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.20418: variable 'ansible_connection' from source: unknown 12081 1726882419.20421: variable 'ansible_module_compression' from source: unknown 12081 1726882419.20423: variable 'ansible_shell_type' from source: unknown 12081 1726882419.20426: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.20429: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.20432: variable 'ansible_pipelining' from source: unknown 12081 1726882419.20434: variable 'ansible_timeout' from source: unknown 12081 1726882419.20436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.20672: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882419.20676: variable 'omit' from source: magic vars 12081 1726882419.20679: starting attempt loop 12081 1726882419.20682: running the handler 12081 1726882419.20696: _low_level_execute_command(): starting 12081 1726882419.20703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882419.21499: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.21511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 
1726882419.21522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.21538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.21589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.21597: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.21607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.21620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.21629: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.21635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.21643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.21657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.21675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.21683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.21691: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.21701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.21778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.21796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.21803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.21943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882419.23639: stdout chunk (state=3): >>>/root <<< 12081 1726882419.23752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.23835: stderr chunk (state=3): >>><<< 12081 1726882419.23840: stdout chunk (state=3): >>><<< 12081 1726882419.23881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.23895: _low_level_execute_command(): starting 12081 1726882419.23902: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379 `" && echo ansible-tmp-1726882419.2388153-13885-64753634753379="` echo /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379 `" ) && sleep 0' 12081 1726882419.24608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12081 1726882419.24617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.24627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.24645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.24690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.24697: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.24706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.24721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.24726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.24733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.24742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.24757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.24774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.24782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.24789: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.24798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.24881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.24900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.24912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12081 1726882419.25037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.26917: stdout chunk (state=3): >>>ansible-tmp-1726882419.2388153-13885-64753634753379=/root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379 <<< 12081 1726882419.27099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.27103: stdout chunk (state=3): >>><<< 12081 1726882419.27109: stderr chunk (state=3): >>><<< 12081 1726882419.27128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882419.2388153-13885-64753634753379=/root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.27178: variable 'ansible_module_compression' from source: unknown 12081 1726882419.27217: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12081 1726882419.27250: variable 'ansible_facts' from source: unknown 12081 1726882419.27330: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/AnsiballZ_ping.py 12081 1726882419.28320: Sending initial data 12081 1726882419.28324: Sent initial data (152 bytes) 12081 1726882419.29296: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.29486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.29497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.29512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.30270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.30274: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.30276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.30279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.30281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.30283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.30286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.30288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.30290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.30292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 
1726882419.30294: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.30376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.30471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.30475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.30479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.30550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.32325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882419.32417: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882419.32520: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpj1ll62e7 /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/AnsiballZ_ping.py <<< 12081 1726882419.32615: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882419.33977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.34068: stderr chunk (state=3): >>><<< 12081 1726882419.34071: stdout chunk (state=3): >>><<< 12081 1726882419.34096: done transferring module 
to remote 12081 1726882419.34107: _low_level_execute_command(): starting 12081 1726882419.34114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/ /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/AnsiballZ_ping.py && sleep 0' 12081 1726882419.34789: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.34798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.34809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.34822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.34872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.34879: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.34889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.34901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.34908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.34915: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.34922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.34930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.34949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.34959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.34968: stderr chunk (state=3): >>>debug2: match found <<< 12081 
1726882419.34978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.35059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.35083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.35093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.35214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.36984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.37045: stderr chunk (state=3): >>><<< 12081 1726882419.37048: stdout chunk (state=3): >>><<< 12081 1726882419.37071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.37074: 
_low_level_execute_command(): starting 12081 1726882419.37082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/AnsiballZ_ping.py && sleep 0' 12081 1726882419.37778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.37787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.37797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.37818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.37855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.37867: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.37879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.37892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.37900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.37906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.37915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.37929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.37940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.37946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.37952: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.37966: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.38055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.38059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.38062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.38202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.51115: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12081 1726882419.52085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882419.52089: stdout chunk (state=3): >>><<< 12081 1726882419.52095: stderr chunk (state=3): >>><<< 12081 1726882419.52114: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882419.52139: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882419.52146: _low_level_execute_command(): starting 12081 1726882419.52151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882419.2388153-13885-64753634753379/ > /dev/null 2>&1 && sleep 0' 12081 1726882419.52842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.52850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.52861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.52879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.52923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.52931: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.52940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.52957: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 12081 1726882419.52960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.52969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.52977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.52986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.53005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.53012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.53019: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.53028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.53101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.53126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.53138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.53274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.55077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.55169: stderr chunk (state=3): >>><<< 12081 1726882419.55181: stdout chunk (state=3): >>><<< 12081 1726882419.55602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.55605: handler run complete 12081 1726882419.55608: attempt loop complete, returning result 12081 1726882419.55610: _execute() done 12081 1726882419.55612: dumping result to json 12081 1726882419.55614: done dumping result, returning 12081 1726882419.55616: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-0a3f-ff3c-0000000006a8] 12081 1726882419.55618: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a8 12081 1726882419.55697: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006a8 12081 1726882419.55702: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12081 1726882419.55757: no more pending results, returning what we have 12081 1726882419.55761: results queue empty 12081 1726882419.55761: checking for any_errors_fatal 12081 1726882419.55769: done checking for any_errors_fatal 12081 1726882419.55769: checking for max_fail_percentage 12081 1726882419.55771: done checking for max_fail_percentage 12081 1726882419.55772: checking to see if all hosts have failed and the running result is not ok 12081 1726882419.55772: done checking to see if all hosts have 
failed 12081 1726882419.55773: getting the remaining hosts for this loop 12081 1726882419.55774: done getting the remaining hosts for this loop 12081 1726882419.55778: getting the next task for host managed_node3 12081 1726882419.55787: done getting next task for host managed_node3 12081 1726882419.55789: ^ task is: TASK: meta (role_complete) 12081 1726882419.55793: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882419.55802: getting variables 12081 1726882419.55804: in VariableManager get_vars() 12081 1726882419.55838: Calling all_inventory to load vars for managed_node3 12081 1726882419.55840: Calling groups_inventory to load vars for managed_node3 12081 1726882419.55842: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882419.55850: Calling all_plugins_play to load vars for managed_node3 12081 1726882419.55852: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882419.55855: Calling groups_plugins_play to load vars for managed_node3 12081 1726882419.57295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882419.64392: done with get_vars() 12081 1726882419.64419: done getting variables 12081 1726882419.64492: done queuing things up, now waiting for results queue to drain 12081 1726882419.64494: results queue empty 12081 1726882419.64495: checking for any_errors_fatal 12081 1726882419.64498: done checking for any_errors_fatal 12081 1726882419.64499: checking for max_fail_percentage 12081 1726882419.64500: done checking for max_fail_percentage 12081 1726882419.64501: checking to see if all hosts have failed and the running result is not ok 12081 1726882419.64501: done checking to see if all hosts have failed 12081 1726882419.64502: getting the remaining hosts for this loop 12081 1726882419.64503: done getting the remaining hosts for this loop 12081 1726882419.64510: getting the next task for host managed_node3 12081 1726882419.64514: done getting next task for host managed_node3 12081 1726882419.64516: ^ task is: TASK: Delete the device '{{ controller_device }}' 12081 1726882419.64519: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882419.64521: getting variables 12081 1726882419.64522: in VariableManager get_vars() 12081 1726882419.64532: Calling all_inventory to load vars for managed_node3 12081 1726882419.64534: Calling groups_inventory to load vars for managed_node3 12081 1726882419.64536: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882419.64541: Calling all_plugins_play to load vars for managed_node3 12081 1726882419.64544: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882419.64546: Calling groups_plugins_play to load vars for managed_node3 12081 1726882419.65783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882419.67484: done with get_vars() 12081 1726882419.67512: done getting variables 12081 1726882419.67561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882419.67681: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 21:33:39 -0400 (0:00:00.490) 0:00:39.480 ****** 12081 1726882419.67716: entering _queue_task() for managed_node3/command 12081 1726882419.68077: worker is 1 (out of 1 available) 12081 1726882419.68091: exiting _queue_task() for managed_node3/command 12081 1726882419.68103: done queuing things up, now waiting for results queue to drain 12081 1726882419.68105: waiting for pending results... 12081 1726882419.68423: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 12081 1726882419.68543: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006d8 12081 1726882419.68565: variable 'ansible_search_path' from source: unknown 12081 1726882419.68571: variable 'ansible_search_path' from source: unknown 12081 1726882419.68611: calling self._execute() 12081 1726882419.68721: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.68725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.68734: variable 'omit' from source: magic vars 12081 1726882419.69137: variable 'ansible_distribution_major_version' from source: facts 12081 1726882419.69150: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882419.69156: variable 'omit' from source: magic vars 12081 1726882419.69181: variable 'omit' from source: magic vars 12081 1726882419.69279: variable 'controller_device' from source: play vars 12081 1726882419.69297: variable 'omit' from source: magic vars 12081 1726882419.69348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882419.69385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882419.69404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 
1726882419.69428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.69444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882419.69475: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882419.69478: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.69482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.69590: Set connection var ansible_pipelining to False 12081 1726882419.69593: Set connection var ansible_shell_type to sh 12081 1726882419.69600: Set connection var ansible_shell_executable to /bin/sh 12081 1726882419.69602: Set connection var ansible_connection to ssh 12081 1726882419.69608: Set connection var ansible_timeout to 10 12081 1726882419.69613: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882419.69642: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.69647: variable 'ansible_connection' from source: unknown 12081 1726882419.69656: variable 'ansible_module_compression' from source: unknown 12081 1726882419.69659: variable 'ansible_shell_type' from source: unknown 12081 1726882419.69662: variable 'ansible_shell_executable' from source: unknown 12081 1726882419.69666: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882419.69668: variable 'ansible_pipelining' from source: unknown 12081 1726882419.69671: variable 'ansible_timeout' from source: unknown 12081 1726882419.69674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882419.69822: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882419.69833: variable 'omit' from source: magic vars 12081 1726882419.69838: starting attempt loop 12081 1726882419.69841: running the handler 12081 1726882419.69867: _low_level_execute_command(): starting 12081 1726882419.69878: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882419.70627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.70637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.70657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.70675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.70713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.70719: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.70729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.70743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.70751: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.70762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.70773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.70786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.70797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.70805: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.70812: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.70820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.70901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.70917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.70925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.71057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.72710: stdout chunk (state=3): >>>/root <<< 12081 1726882419.72801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.72900: stderr chunk (state=3): >>><<< 12081 1726882419.72914: stdout chunk (state=3): >>><<< 12081 1726882419.73058: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.73062: _low_level_execute_command(): starting 12081 1726882419.73067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925 `" && echo ansible-tmp-1726882419.7295444-13919-29885459065925="` echo /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925 `" ) && sleep 0' 12081 1726882419.73726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.73745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.73766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.73786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.73836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.73856: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.73877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.73896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.73908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.73921: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.73938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.73960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.73980: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.73994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.74006: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.74020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.74109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.74133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.74159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.74299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.76167: stdout chunk (state=3): >>>ansible-tmp-1726882419.7295444-13919-29885459065925=/root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925 <<< 12081 1726882419.76289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.76358: stderr chunk (state=3): >>><<< 12081 1726882419.76361: stdout chunk (state=3): >>><<< 12081 1726882419.76381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882419.7295444-13919-29885459065925=/root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.76413: variable 'ansible_module_compression' from source: unknown 12081 1726882419.76473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882419.76508: variable 'ansible_facts' from source: unknown 12081 1726882419.76596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/AnsiballZ_command.py 12081 1726882419.76746: Sending initial data 12081 1726882419.76750: Sent initial data (155 bytes) 12081 1726882419.77724: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.77733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.77743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.77758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.77802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.77810: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.77820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.77833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 
1726882419.77842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.77848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.77857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.77867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.77883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.77891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.77898: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.77907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.77981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.77997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.78007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.78138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.79934: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882419.80226: stderr chunk (state=3): >>>debug1: Using server download 
size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882419.80230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpuaj6024l /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/AnsiballZ_command.py <<< 12081 1726882419.80232: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882419.81638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.81758: stderr chunk (state=3): >>><<< 12081 1726882419.81761: stdout chunk (state=3): >>><<< 12081 1726882419.81765: done transferring module to remote 12081 1726882419.81772: _low_level_execute_command(): starting 12081 1726882419.81775: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/ /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/AnsiballZ_command.py && sleep 0' 12081 1726882419.82340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.82358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.82377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.82396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.82439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.82456: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.82475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.82494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.82506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address <<< 12081 1726882419.82518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882419.82531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.82544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.82564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.82581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.82593: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.82608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.82693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.82710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.82725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.82860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.84643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882419.84647: stdout chunk (state=3): >>><<< 12081 1726882419.84653: stderr chunk (state=3): >>><<< 12081 1726882419.84676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882419.84680: _low_level_execute_command(): starting 12081 1726882419.84685: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/AnsiballZ_command.py && sleep 0' 12081 1726882419.85319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882419.85323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.85326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.85339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.85382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.85390: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882419.85399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.85412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882419.85420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882419.85426: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 12081 1726882419.85434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882419.85443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882419.85454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882419.85468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882419.85477: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882419.85488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882419.85555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882419.85576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882419.85587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882419.85729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882419.99499: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:39.986345", "end": "2024-09-20 21:33:39.993456", "delta": "0:00:00.007111", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882420.00610: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882420.00614: stdout chunk (state=3): >>><<< 12081 1726882420.00619: stderr chunk (state=3): >>><<< 12081 1726882420.00642: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:39.986345", "end": "2024-09-20 21:33:39.993456", "delta": "0:00:00.007111", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. 
12081 1726882420.00689: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882420.00696: _low_level_execute_command(): starting 12081 1726882420.00701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882419.7295444-13919-29885459065925/ > /dev/null 2>&1 && sleep 0' 12081 1726882420.01333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.01341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.01351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.01372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.01411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.01418: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.01429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.01441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.01451: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.01453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.01468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.01478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.01490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.01498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.01504: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.01513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.01588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.01603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.01607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.01747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.03570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.03654: stderr chunk (state=3): >>><<< 12081 1726882420.03672: stdout chunk (state=3): >>><<< 12081 1726882420.03773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.03777: handler run complete 12081 1726882420.03779: Evaluated conditional (False): False 12081 1726882420.03782: Evaluated conditional (False): False 12081 1726882420.03784: attempt loop complete, returning result 12081 1726882420.03786: _execute() done 12081 1726882420.03788: dumping result to json 12081 1726882420.03790: done dumping result, returning 12081 1726882420.03792: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0e448fcc-3ce9-0a3f-ff3c-0000000006d8] 12081 1726882420.03971: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006d8 12081 1726882420.04048: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006d8 12081 1726882420.04051: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007111", "end": "2024-09-20 21:33:39.993456", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:33:39.986345" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 12081 1726882420.04151: no more pending results, returning what we have 12081 1726882420.04155: results queue empty 12081 1726882420.04157: checking for any_errors_fatal 12081 1726882420.04158: done checking for any_errors_fatal 12081 1726882420.04159: checking for max_fail_percentage 
12081 1726882420.04161: done checking for max_fail_percentage 12081 1726882420.04162: checking to see if all hosts have failed and the running result is not ok 12081 1726882420.04166: done checking to see if all hosts have failed 12081 1726882420.04167: getting the remaining hosts for this loop 12081 1726882420.04169: done getting the remaining hosts for this loop 12081 1726882420.04173: getting the next task for host managed_node3 12081 1726882420.04186: done getting next task for host managed_node3 12081 1726882420.04189: ^ task is: TASK: Remove test interfaces 12081 1726882420.04194: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882420.04200: getting variables 12081 1726882420.04203: in VariableManager get_vars() 12081 1726882420.04243: Calling all_inventory to load vars for managed_node3 12081 1726882420.04246: Calling groups_inventory to load vars for managed_node3 12081 1726882420.04249: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882420.04266: Calling all_plugins_play to load vars for managed_node3 12081 1726882420.04269: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882420.04273: Calling groups_plugins_play to load vars for managed_node3 12081 1726882420.06044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882420.07669: done with get_vars() 12081 1726882420.07693: done getting variables 12081 1726882420.07753: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:40 -0400 (0:00:00.400) 0:00:39.880 ****** 12081 1726882420.07790: entering _queue_task() for managed_node3/shell 12081 1726882420.08113: worker is 1 (out of 1 available) 12081 1726882420.08125: exiting _queue_task() for managed_node3/shell 12081 1726882420.08138: done queuing things up, now waiting for results queue to drain 12081 1726882420.08139: waiting for pending results... 
12081 1726882420.08436: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 12081 1726882420.08555: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006de 12081 1726882420.08581: variable 'ansible_search_path' from source: unknown 12081 1726882420.08588: variable 'ansible_search_path' from source: unknown 12081 1726882420.08627: calling self._execute() 12081 1726882420.08732: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882420.08745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882420.08757: variable 'omit' from source: magic vars 12081 1726882420.09134: variable 'ansible_distribution_major_version' from source: facts 12081 1726882420.09151: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882420.09165: variable 'omit' from source: magic vars 12081 1726882420.09215: variable 'omit' from source: magic vars 12081 1726882420.09387: variable 'dhcp_interface1' from source: play vars 12081 1726882420.09397: variable 'dhcp_interface2' from source: play vars 12081 1726882420.09422: variable 'omit' from source: magic vars 12081 1726882420.09476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882420.09514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882420.09537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882420.09565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882420.09583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882420.09616: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882420.09625: variable 'ansible_host' from source: host 
vars for 'managed_node3' 12081 1726882420.09632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882420.09739: Set connection var ansible_pipelining to False 12081 1726882420.09746: Set connection var ansible_shell_type to sh 12081 1726882420.09758: Set connection var ansible_shell_executable to /bin/sh 12081 1726882420.09770: Set connection var ansible_connection to ssh 12081 1726882420.09780: Set connection var ansible_timeout to 10 12081 1726882420.09788: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882420.09817: variable 'ansible_shell_executable' from source: unknown 12081 1726882420.09825: variable 'ansible_connection' from source: unknown 12081 1726882420.09832: variable 'ansible_module_compression' from source: unknown 12081 1726882420.09839: variable 'ansible_shell_type' from source: unknown 12081 1726882420.09846: variable 'ansible_shell_executable' from source: unknown 12081 1726882420.09852: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882420.09860: variable 'ansible_pipelining' from source: unknown 12081 1726882420.09871: variable 'ansible_timeout' from source: unknown 12081 1726882420.09884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882420.10022: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882420.10039: variable 'omit' from source: magic vars 12081 1726882420.10049: starting attempt loop 12081 1726882420.10056: running the handler 12081 1726882420.10072: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882420.10103: _low_level_execute_command(): starting 12081 1726882420.10115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882420.10886: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.10900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.10914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.10933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.10978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.10991: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.11008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.11027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.11039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.11051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.11067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.11082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.11118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.11131: stderr chunk (state=3): >>>debug2: match found <<< 12081 
1726882420.11143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.11214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.11231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.11245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.11390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.13025: stdout chunk (state=3): >>>/root <<< 12081 1726882420.13130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.13205: stderr chunk (state=3): >>><<< 12081 1726882420.13208: stdout chunk (state=3): >>><<< 12081 1726882420.13307: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12081 1726882420.13318: _low_level_execute_command(): starting 12081 1726882420.13321: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850 `" && echo ansible-tmp-1726882420.132266-13936-210386224324850="` echo /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850 `" ) && sleep 0' 12081 1726882420.13880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.13896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.13912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.13931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.13977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.13992: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.14007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.14024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.14036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.14047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.14059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.14077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.14098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.14111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.14122: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.14136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.14215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.14232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.14247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.14391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.16279: stdout chunk (state=3): >>>ansible-tmp-1726882420.132266-13936-210386224324850=/root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850 <<< 12081 1726882420.16487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.16490: stdout chunk (state=3): >>><<< 12081 1726882420.16493: stderr chunk (state=3): >>><<< 12081 1726882420.16811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882420.132266-13936-210386224324850=/root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.16815: variable 'ansible_module_compression' from source: unknown 12081 1726882420.16817: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882420.16820: variable 'ansible_facts' from source: unknown 12081 1726882420.16822: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/AnsiballZ_command.py 12081 1726882420.16887: Sending initial data 12081 1726882420.16890: Sent initial data (155 bytes) 12081 1726882420.17891: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.17908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.17924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.17943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.17990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.18008: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.18022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.18041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.18054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.18069: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 12081 1726882420.18082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.18097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.18118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.18133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.18145: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.18159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.18238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.18258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.18276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.18400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.20148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882420.20247: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882420.20350: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-12081i6b718uh/tmpor0rm7pr /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/AnsiballZ_command.py <<< 12081 1726882420.20444: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882420.22188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.22309: stderr chunk (state=3): >>><<< 12081 1726882420.22313: stdout chunk (state=3): >>><<< 12081 1726882420.22316: done transferring module to remote 12081 1726882420.22318: _low_level_execute_command(): starting 12081 1726882420.22320: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/ /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/AnsiballZ_command.py && sleep 0' 12081 1726882420.23024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.23037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.23051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.23083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.23124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.23136: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.23150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.23172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.23189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.23200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.23212: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.23224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.23239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.23250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.23262: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.23279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.23356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.23381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.23402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.23530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.25428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.25432: stdout chunk (state=3): >>><<< 12081 1726882420.25434: stderr chunk (state=3): >>><<< 12081 1726882420.25523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.25526: _low_level_execute_command(): starting 12081 1726882420.25529: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/AnsiballZ_command.py && sleep 0' 12081 1726882420.26942: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.27383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.27401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.27421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.27471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.27487: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.27502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.27520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.27533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.27544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.27557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.27574: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.27591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.27604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.27617: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.27708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.27732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.27748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.27893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.46644: stdout chunk (state=3): >>> <<< 12081 1726882420.46888: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:40.407730", "end": "2024-09-20 21:33:40.464595", "delta": "0:00:00.056865", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip 
link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882420.48537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882420.48541: stdout chunk (state=3): >>><<< 12081 1726882420.48544: stderr chunk (state=3): >>><<< 12081 1726882420.48709: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:40.407730", "end": "2024-09-20 21:33:40.464595", "delta": "0:00:00.056865", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", 
"_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882420.48713: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882420.48716: _low_level_execute_command(): starting 12081 1726882420.48718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882420.132266-13936-210386224324850/ > /dev/null 2>&1 && sleep 0' 12081 1726882420.50130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.50134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.50151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882420.50281: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.50284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.50287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.50369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.50373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.50387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.50599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.52425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.52503: stderr chunk (state=3): >>><<< 12081 1726882420.52506: stdout chunk (state=3): >>><<< 12081 1726882420.52774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.52778: handler run complete 12081 1726882420.52780: Evaluated conditional (False): False 12081 1726882420.52782: attempt loop complete, returning result 12081 1726882420.52784: _execute() done 12081 1726882420.52786: dumping result to json 12081 1726882420.52788: done dumping result, returning 12081 1726882420.52790: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0e448fcc-3ce9-0a3f-ff3c-0000000006de] 12081 1726882420.52792: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006de 12081 1726882420.52885: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006de 12081 1726882420.52888: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.056865", "end": "2024-09-20 21:33:40.464595", "rc": 0, "start": "2024-09-20 21:33:40.407730" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12081 1726882420.52956: no more pending results, returning what we have 
12081 1726882420.52959: results queue empty 12081 1726882420.52960: checking for any_errors_fatal 12081 1726882420.52976: done checking for any_errors_fatal 12081 1726882420.52977: checking for max_fail_percentage 12081 1726882420.52978: done checking for max_fail_percentage 12081 1726882420.52979: checking to see if all hosts have failed and the running result is not ok 12081 1726882420.52980: done checking to see if all hosts have failed 12081 1726882420.52981: getting the remaining hosts for this loop 12081 1726882420.52983: done getting the remaining hosts for this loop 12081 1726882420.52987: getting the next task for host managed_node3 12081 1726882420.52993: done getting next task for host managed_node3 12081 1726882420.52996: ^ task is: TASK: Stop dnsmasq/radvd services 12081 1726882420.53001: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882420.53005: getting variables 12081 1726882420.53007: in VariableManager get_vars() 12081 1726882420.53047: Calling all_inventory to load vars for managed_node3 12081 1726882420.53049: Calling groups_inventory to load vars for managed_node3 12081 1726882420.53052: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882420.53082: Calling all_plugins_play to load vars for managed_node3 12081 1726882420.53086: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882420.53089: Calling groups_plugins_play to load vars for managed_node3 12081 1726882420.54731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882420.58088: done with get_vars() 12081 1726882420.58119: done getting variables 12081 1726882420.58181: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:33:40 -0400 (0:00:00.504) 0:00:40.385 ****** 12081 1726882420.58214: entering _queue_task() for managed_node3/shell 12081 1726882420.58556: worker is 1 (out of 1 available) 12081 1726882420.58571: exiting _queue_task() for managed_node3/shell 12081 1726882420.58584: done queuing things up, now waiting for results queue to drain 12081 1726882420.58585: waiting for pending results... 
12081 1726882420.58959: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 12081 1726882420.59094: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000006df 12081 1726882420.59115: variable 'ansible_search_path' from source: unknown 12081 1726882420.59125: variable 'ansible_search_path' from source: unknown 12081 1726882420.59166: calling self._execute() 12081 1726882420.59272: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882420.59281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882420.59292: variable 'omit' from source: magic vars 12081 1726882420.59657: variable 'ansible_distribution_major_version' from source: facts 12081 1726882420.59682: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882420.59693: variable 'omit' from source: magic vars 12081 1726882420.59767: variable 'omit' from source: magic vars 12081 1726882420.59803: variable 'omit' from source: magic vars 12081 1726882420.59848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882420.59896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882420.59923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882420.59946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882420.59962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882420.60004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882420.60012: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882420.60019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882420.60134: Set connection var ansible_pipelining to False 12081 1726882420.60142: Set connection var ansible_shell_type to sh 12081 1726882420.60154: Set connection var ansible_shell_executable to /bin/sh 12081 1726882420.60161: Set connection var ansible_connection to ssh 12081 1726882420.60173: Set connection var ansible_timeout to 10 12081 1726882420.60182: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882420.60214: variable 'ansible_shell_executable' from source: unknown 12081 1726882420.60222: variable 'ansible_connection' from source: unknown 12081 1726882420.60230: variable 'ansible_module_compression' from source: unknown 12081 1726882420.60236: variable 'ansible_shell_type' from source: unknown 12081 1726882420.60242: variable 'ansible_shell_executable' from source: unknown 12081 1726882420.60248: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882420.60256: variable 'ansible_pipelining' from source: unknown 12081 1726882420.60264: variable 'ansible_timeout' from source: unknown 12081 1726882420.60272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882420.60424: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882420.60444: variable 'omit' from source: magic vars 12081 1726882420.60453: starting attempt loop 12081 1726882420.60460: running the handler 12081 1726882420.60599: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882420.60626: 
_low_level_execute_command(): starting 12081 1726882420.60638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882420.61398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.61414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.61430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.61452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.61502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.61516: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.61531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.61550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.61564: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.61580: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.61594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.61609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.61625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.61640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.61651: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.61668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.61746: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 12081 1726882420.61772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.61794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.61929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.63585: stdout chunk (state=3): >>>/root <<< 12081 1726882420.63786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.63790: stdout chunk (state=3): >>><<< 12081 1726882420.63792: stderr chunk (state=3): >>><<< 12081 1726882420.63879: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.63882: _low_level_execute_command(): starting 12081 1726882420.63893: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342 `" && echo ansible-tmp-1726882420.6383488-13968-250963842341342="` echo /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342 `" ) && sleep 0' 12081 1726882420.64897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.64917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.64933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.64951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.64996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.65014: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.65027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.65043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.65052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.65062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.65076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.65087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.65100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.65110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.65124: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.65138: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.65217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.65248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.65267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.65402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.67298: stdout chunk (state=3): >>>ansible-tmp-1726882420.6383488-13968-250963842341342=/root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342 <<< 12081 1726882420.67415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.67512: stderr chunk (state=3): >>><<< 12081 1726882420.67528: stdout chunk (state=3): >>><<< 12081 1726882420.67569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882420.6383488-13968-250963842341342=/root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.67775: variable 'ansible_module_compression' from source: unknown 12081 1726882420.67778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882420.67781: variable 'ansible_facts' from source: unknown 12081 1726882420.67793: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/AnsiballZ_command.py 12081 1726882420.67955: Sending initial data 12081 1726882420.67958: Sent initial data (156 bytes) 12081 1726882420.69009: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.69024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.69037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.69055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.69108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.69119: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.69132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.69148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.69159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.69172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.69185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.69203: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.69220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.69232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.69243: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.69257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.69332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.69353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.69370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.69506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.71275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882420.71365: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882420.71469: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpf5spsbd9 /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/AnsiballZ_command.py <<< 12081 1726882420.71565: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882420.72882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.73027: stderr chunk (state=3): >>><<< 12081 1726882420.73030: stdout chunk (state=3): >>><<< 12081 1726882420.73057: done transferring module to remote 12081 1726882420.73063: _low_level_execute_command(): starting 12081 1726882420.73071: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/ /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/AnsiballZ_command.py && sleep 0' 12081 1726882420.74067: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.74070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.74086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.74123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882420.74130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882420.74142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.74157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.74161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882420.74175: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.74247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.74285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.74389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.76191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.76213: stderr chunk (state=3): >>><<< 12081 1726882420.76216: stdout chunk (state=3): >>><<< 12081 1726882420.76311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.76314: _low_level_execute_command(): starting 12081 1726882420.76317: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/AnsiballZ_command.py && sleep 0' 12081 1726882420.76887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.76903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.76918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.76936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.76984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.76999: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.77014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.77033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.77046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.77061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.77080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.77095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.77112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.77125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.77136: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.77150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.77227: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 12081 1726882420.77243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.77261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.77403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.92636: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:40.904323", "end": "2024-09-20 21:33:40.924768", "delta": "0:00:00.020445", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 
1726882420.93968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882420.93977: stdout chunk (state=3): >>><<< 12081 1726882420.93979: stderr chunk (state=3): >>><<< 12081 1726882420.94122: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:40.904323", "end": "2024-09-20 21:33:40.924768", "delta": "0:00:00.020445", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882420.94126: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882420.94133: _low_level_execute_command(): starting 12081 1726882420.94136: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882420.6383488-13968-250963842341342/ > /dev/null 2>&1 && sleep 0' 12081 1726882420.94706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882420.94722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.94738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.94757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882420.94800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.94810: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882420.94821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.94837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882420.94847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882420.94855: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882420.94870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882420.94883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882420.94898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882420.94907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882420.94915: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882420.94926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882420.95000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882420.95022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882420.95035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882420.95165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882420.96983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882420.97055: stderr chunk (state=3): >>><<< 12081 1726882420.97074: stdout chunk (state=3): >>><<< 12081 1726882420.97270: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882420.97278: handler run complete 12081 1726882420.97280: Evaluated conditional (False): False 12081 1726882420.97282: attempt loop complete, returning result 12081 1726882420.97285: _execute() done 12081 1726882420.97287: dumping result to json 12081 1726882420.97289: done dumping result, returning 12081 1726882420.97291: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-0a3f-ff3c-0000000006df] 12081 1726882420.97293: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006df 12081 1726882420.97377: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000006df 12081 1726882420.97380: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.020445", "end": "2024-09-20 21:33:40.924768", "rc": 0, "start": "2024-09-20 21:33:40.904323" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12081 1726882420.97457: no more pending results, returning what we have 12081 1726882420.97463: results queue empty 12081 1726882420.97466: checking for any_errors_fatal 12081 1726882420.97479: done checking for any_errors_fatal 12081 1726882420.97480: checking for max_fail_percentage 12081 1726882420.97483: done checking for max_fail_percentage 12081 1726882420.97484: checking to see if all hosts have failed and the running result is not ok 12081 1726882420.97485: done checking to see if all hosts have failed 12081 1726882420.97486: getting the remaining hosts for this loop 12081 1726882420.97488: done getting the remaining hosts for this loop 12081 1726882420.97493: getting the next task for host managed_node3 12081 1726882420.97505: done getting next task for host managed_node3 12081 1726882420.97508: ^ task is: TASK: Reset bond options to assert 12081 1726882420.97511: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882420.97515: getting variables 12081 1726882420.97517: in VariableManager get_vars() 12081 1726882420.97563: Calling all_inventory to load vars for managed_node3 12081 1726882420.97568: Calling groups_inventory to load vars for managed_node3 12081 1726882420.97571: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882420.97584: Calling all_plugins_play to load vars for managed_node3 12081 1726882420.97587: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882420.97591: Calling groups_plugins_play to load vars for managed_node3 12081 1726882420.99572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.01334: done with get_vars() 12081 1726882421.01360: done getting variables 12081 1726882421.01423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Friday 20 September 2024 21:33:41 -0400 (0:00:00.432) 0:00:40.817 ****** 12081 1726882421.01454: entering _queue_task() for managed_node3/set_fact 12081 1726882421.01788: worker is 1 (out of 1 available) 12081 1726882421.01799: exiting _queue_task() for managed_node3/set_fact 12081 1726882421.01816: done queuing things up, now waiting for results queue to drain 12081 1726882421.01817: waiting for pending results... 
12081 1726882421.02113: running TaskExecutor() for managed_node3/TASK: Reset bond options to assert 12081 1726882421.02216: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000000f 12081 1726882421.02244: variable 'ansible_search_path' from source: unknown 12081 1726882421.02300: calling self._execute() 12081 1726882421.02422: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.02434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.02448: variable 'omit' from source: magic vars 12081 1726882421.02880: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.02905: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.02918: variable 'omit' from source: magic vars 12081 1726882421.02958: variable 'omit' from source: magic vars 12081 1726882421.03005: variable 'dhcp_interface1' from source: play vars 12081 1726882421.03092: variable 'dhcp_interface1' from source: play vars 12081 1726882421.03118: variable 'omit' from source: magic vars 12081 1726882421.03176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882421.03216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.03249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882421.03279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.03296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.03329: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.03340: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.03355: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.03474: Set connection var ansible_pipelining to False 12081 1726882421.03481: Set connection var ansible_shell_type to sh 12081 1726882421.03494: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.03501: Set connection var ansible_connection to ssh 12081 1726882421.03511: Set connection var ansible_timeout to 10 12081 1726882421.03519: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.03547: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.03561: variable 'ansible_connection' from source: unknown 12081 1726882421.03576: variable 'ansible_module_compression' from source: unknown 12081 1726882421.03584: variable 'ansible_shell_type' from source: unknown 12081 1726882421.03591: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.03598: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.03606: variable 'ansible_pipelining' from source: unknown 12081 1726882421.03612: variable 'ansible_timeout' from source: unknown 12081 1726882421.03619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.03773: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.03797: variable 'omit' from source: magic vars 12081 1726882421.03810: starting attempt loop 12081 1726882421.03817: running the handler 12081 1726882421.03833: handler run complete 12081 1726882421.03848: attempt loop complete, returning result 12081 1726882421.03858: _execute() done 12081 1726882421.03870: dumping result to json 12081 1726882421.03879: done dumping result, returning 12081 
1726882421.03894: done running TaskExecutor() for managed_node3/TASK: Reset bond options to assert [0e448fcc-3ce9-0a3f-ff3c-00000000000f] 12081 1726882421.03910: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000000f
ok: [managed_node3] => {
    "ansible_facts": {
        "bond_options_to_assert": [
            {
                "key": "mode",
                "value": "active-backup"
            },
            {
                "key": "arp_interval",
                "value": "60"
            },
            {
                "key": "arp_ip_target",
                "value": "192.0.2.128"
            },
            {
                "key": "arp_validate",
                "value": "none"
            },
            {
                "key": "primary",
                "value": "test1"
            }
        ]
    },
    "changed": false
}
12081 1726882421.04104: no more pending results, returning what we have 12081 1726882421.04109: results queue empty 12081 1726882421.04110: checking for any_errors_fatal 12081 1726882421.04125: done checking for any_errors_fatal 12081 1726882421.04128: checking for max_fail_percentage 12081 1726882421.04130: done checking for max_fail_percentage 12081 1726882421.04131: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.04132: done checking to see if all hosts have failed 12081 1726882421.04133: getting the remaining hosts for this loop 12081 1726882421.04135: done getting the remaining hosts for this loop 12081 1726882421.04139: getting the next task for host managed_node3 12081 1726882421.04149: done getting next task for host managed_node3 12081 1726882421.04155: ^ task is: TASK: Include the task 'run_test.yml' 12081 1726882421.04158: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882421.04162: getting variables 12081 1726882421.04166: in VariableManager get_vars() 12081 1726882421.04207: Calling all_inventory to load vars for managed_node3 12081 1726882421.04210: Calling groups_inventory to load vars for managed_node3 12081 1726882421.04212: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.04225: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.04228: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.04231: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.05224: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000000f 12081 1726882421.05227: WORKER PROCESS EXITING 12081 1726882421.06050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.07820: done with get_vars() 12081 1726882421.07856: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Friday 20 September 2024 21:33:41 -0400 (0:00:00.064) 0:00:40.882 ****** 12081 1726882421.07962: entering _queue_task() for managed_node3/include_tasks 12081 1726882421.08307: worker is 1 (out of 1 available) 12081 1726882421.08320: exiting _queue_task() for managed_node3/include_tasks 12081 1726882421.08336: done queuing things up, now waiting for results queue to drain 12081 1726882421.08338: waiting for pending results... 
12081 1726882421.08670: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 12081 1726882421.08792: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000011 12081 1726882421.08812: variable 'ansible_search_path' from source: unknown 12081 1726882421.08863: calling self._execute() 12081 1726882421.08979: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.08996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.09015: variable 'omit' from source: magic vars 12081 1726882421.09432: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.09459: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.09472: _execute() done 12081 1726882421.09482: dumping result to json 12081 1726882421.09494: done dumping result, returning 12081 1726882421.09505: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000011] 12081 1726882421.09517: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000011 12081 1726882421.09686: no more pending results, returning what we have 12081 1726882421.09691: in VariableManager get_vars() 12081 1726882421.09739: Calling all_inventory to load vars for managed_node3 12081 1726882421.09742: Calling groups_inventory to load vars for managed_node3 12081 1726882421.09744: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.09764: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.09770: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.09774: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.10819: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000011 12081 1726882421.10822: WORKER PROCESS EXITING 12081 1726882421.11721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 12081 1726882421.13450: done with get_vars() 12081 1726882421.13484: variable 'ansible_search_path' from source: unknown 12081 1726882421.13501: we have included files to process 12081 1726882421.13502: generating all_blocks data 12081 1726882421.13504: done generating all_blocks data 12081 1726882421.13511: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882421.13513: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882421.13515: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12081 1726882421.13971: in VariableManager get_vars() 12081 1726882421.13994: done with get_vars() 12081 1726882421.14042: in VariableManager get_vars() 12081 1726882421.14068: done with get_vars() 12081 1726882421.14110: in VariableManager get_vars() 12081 1726882421.14134: done with get_vars() 12081 1726882421.14183: in VariableManager get_vars() 12081 1726882421.14202: done with get_vars() 12081 1726882421.14248: in VariableManager get_vars() 12081 1726882421.14272: done with get_vars() 12081 1726882421.14691: in VariableManager get_vars() 12081 1726882421.14710: done with get_vars() 12081 1726882421.14723: done processing included file 12081 1726882421.14725: iterating over new_blocks loaded from include file 12081 1726882421.14726: in VariableManager get_vars() 12081 1726882421.14740: done with get_vars() 12081 1726882421.14742: filtering new block on tags 12081 1726882421.14858: done filtering new block on tags 12081 1726882421.14861: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 12081 1726882421.14869: extending task lists for all hosts with included 
blocks 12081 1726882421.14910: done extending task lists 12081 1726882421.14912: done processing included files 12081 1726882421.14913: results queue empty 12081 1726882421.14913: checking for any_errors_fatal 12081 1726882421.14916: done checking for any_errors_fatal 12081 1726882421.14917: checking for max_fail_percentage 12081 1726882421.14918: done checking for max_fail_percentage 12081 1726882421.14919: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.14920: done checking to see if all hosts have failed 12081 1726882421.14921: getting the remaining hosts for this loop 12081 1726882421.14922: done getting the remaining hosts for this loop 12081 1726882421.14925: getting the next task for host managed_node3 12081 1726882421.14928: done getting next task for host managed_node3 12081 1726882421.14930: ^ task is: TASK: TEST: {{ lsr_description }} 12081 1726882421.14933: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882421.14936: getting variables 12081 1726882421.14937: in VariableManager get_vars() 12081 1726882421.14947: Calling all_inventory to load vars for managed_node3 12081 1726882421.14950: Calling groups_inventory to load vars for managed_node3 12081 1726882421.14955: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.14960: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.14962: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.14967: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.16314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.18036: done with get_vars() 12081 1726882421.18067: done getting variables 12081 1726882421.18112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882421.18224: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:33:41 -0400 (0:00:00.102) 0:00:40.985 ****** 12081 1726882421.18266: entering _queue_task() for managed_node3/debug 12081 1726882421.18628: worker is 1 (out of 1 available) 12081 1726882421.18641: exiting _queue_task() for managed_node3/debug 12081 1726882421.18657: done queuing things up, now waiting for results queue to drain 12081 1726882421.18659: waiting for pending results... 
12081 1726882421.18994: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 12081 1726882421.19130: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008ea 12081 1726882421.19157: variable 'ansible_search_path' from source: unknown 12081 1726882421.19170: variable 'ansible_search_path' from source: unknown 12081 1726882421.19217: calling self._execute() 12081 1726882421.19330: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.19345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.19368: variable 'omit' from source: magic vars 12081 1726882421.19784: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.19804: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.19815: variable 'omit' from source: magic vars 12081 1726882421.19857: variable 'omit' from source: magic vars 12081 1726882421.19971: variable 'lsr_description' from source: include params 12081 1726882421.20004: variable 'omit' from source: magic vars 12081 1726882421.20055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882421.20101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.20130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882421.20155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.20175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.20219: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.20228: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.20236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.20357: Set connection var ansible_pipelining to False 12081 1726882421.20368: Set connection var ansible_shell_type to sh 12081 1726882421.20383: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.20392: Set connection var ansible_connection to ssh 12081 1726882421.20403: Set connection var ansible_timeout to 10 12081 1726882421.20413: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.20451: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.20466: variable 'ansible_connection' from source: unknown 12081 1726882421.20475: variable 'ansible_module_compression' from source: unknown 12081 1726882421.20482: variable 'ansible_shell_type' from source: unknown 12081 1726882421.20489: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.20496: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.20504: variable 'ansible_pipelining' from source: unknown 12081 1726882421.20509: variable 'ansible_timeout' from source: unknown 12081 1726882421.20515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.20664: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.20682: variable 'omit' from source: magic vars 12081 1726882421.20690: starting attempt loop 12081 1726882421.20695: running the handler 12081 1726882421.20743: handler run complete 12081 1726882421.20771: attempt loop complete, 
returning result 12081 1726882421.20777: _execute() done 12081 1726882421.20782: dumping result to json 12081 1726882421.20788: done dumping result, returning 12081 1726882421.20798: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0e448fcc-3ce9-0a3f-ff3c-0000000008ea] 12081 1726882421.20808: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ea
ok: [managed_node3] => {}

MSG:

##########
Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.
##########

12081 1726882421.20948: no more pending results, returning what we have 12081 1726882421.20954: results queue empty 12081 1726882421.20955: checking for any_errors_fatal 12081 1726882421.20957: done checking for any_errors_fatal 12081 1726882421.20957: checking for max_fail_percentage 12081 1726882421.20959: done checking for max_fail_percentage 12081 1726882421.20960: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.20961: done checking to see if all hosts have failed 12081 1726882421.20962: getting the remaining hosts for this loop 12081 1726882421.20965: done getting the remaining hosts for this loop 12081 1726882421.20969: getting the next task for host managed_node3 12081 1726882421.20976: done getting next task for host managed_node3 12081 1726882421.20979: ^ task is: TASK: Show item 12081 1726882421.20983: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882421.20987: getting variables 12081 1726882421.20989: in VariableManager get_vars() 12081 1726882421.21029: Calling all_inventory to load vars for managed_node3 12081 1726882421.21032: Calling groups_inventory to load vars for managed_node3 12081 1726882421.21034: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.21047: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.21050: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.21055: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.22156: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ea 12081 1726882421.22160: WORKER PROCESS EXITING 12081 1726882421.22903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.24691: done with get_vars() 12081 1726882421.24722: done getting variables 12081 1726882421.24795: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:33:41 -0400 (0:00:00.065) 0:00:41.051 ****** 12081 1726882421.24832: entering _queue_task() for 
managed_node3/debug 12081 1726882421.25179: worker is 1 (out of 1 available) 12081 1726882421.25192: exiting _queue_task() for managed_node3/debug 12081 1726882421.25204: done queuing things up, now waiting for results queue to drain 12081 1726882421.25205: waiting for pending results... 12081 1726882421.25526: running TaskExecutor() for managed_node3/TASK: Show item 12081 1726882421.25646: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008eb 12081 1726882421.25679: variable 'ansible_search_path' from source: unknown 12081 1726882421.25688: variable 'ansible_search_path' from source: unknown 12081 1726882421.25746: variable 'omit' from source: magic vars 12081 1726882421.25912: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.25925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.25937: variable 'omit' from source: magic vars 12081 1726882421.26305: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.26327: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.26336: variable 'omit' from source: magic vars 12081 1726882421.26379: variable 'omit' from source: magic vars 12081 1726882421.26427: variable 'item' from source: unknown 12081 1726882421.26493: variable 'item' from source: unknown 12081 1726882421.26513: variable 'omit' from source: magic vars 12081 1726882421.26572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882421.26611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.26644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882421.26672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.26688: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.26722: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.26731: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.26745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.26864: Set connection var ansible_pipelining to False 12081 1726882421.26875: Set connection var ansible_shell_type to sh 12081 1726882421.26888: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.26894: Set connection var ansible_connection to ssh 12081 1726882421.26903: Set connection var ansible_timeout to 10 12081 1726882421.26912: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.26937: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.26946: variable 'ansible_connection' from source: unknown 12081 1726882421.26961: variable 'ansible_module_compression' from source: unknown 12081 1726882421.26974: variable 'ansible_shell_type' from source: unknown 12081 1726882421.26982: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.26989: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.26997: variable 'ansible_pipelining' from source: unknown 12081 1726882421.27003: variable 'ansible_timeout' from source: unknown 12081 1726882421.27012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.27168: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.27193: variable 'omit' from source: magic vars 12081 1726882421.27204: starting attempt 
loop 12081 1726882421.27211: running the handler 12081 1726882421.27268: variable 'lsr_description' from source: include params 12081 1726882421.27345: variable 'lsr_description' from source: include params 12081 1726882421.27366: handler run complete 12081 1726882421.27389: attempt loop complete, returning result 12081 1726882421.27417: variable 'item' from source: unknown 12081 1726882421.27487: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device."
}
12081 1726882421.27745: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.27766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.27782: variable 'omit' from source: magic vars 12081 1726882421.27960: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.27973: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.27981: variable 'omit' from source: magic vars 12081 1726882421.27998: variable 'omit' from source: magic vars 12081 1726882421.28047: variable 'item' from source: unknown 12081 1726882421.28121: variable 'item' from source: unknown 12081 1726882421.28139: variable 'omit' from source: magic vars 12081 1726882421.28167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.28181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.28191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12081 1726882421.28207: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.28215: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.28229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.28310: Set connection var ansible_pipelining to False 12081 1726882421.28316: Set connection var ansible_shell_type to sh 12081 1726882421.28327: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.28336: Set connection var ansible_connection to ssh 12081 1726882421.28343: Set connection var ansible_timeout to 10 12081 1726882421.28349: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.28373: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.28379: variable 'ansible_connection' from source: unknown 12081 1726882421.28384: variable 'ansible_module_compression' from source: unknown 12081 1726882421.28389: variable 'ansible_shell_type' from source: unknown 12081 1726882421.28393: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.28398: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.28403: variable 'ansible_pipelining' from source: unknown 12081 1726882421.28407: variable 'ansible_timeout' from source: unknown 12081 1726882421.28412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.28507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.28522: variable 'omit' from source: magic vars 12081 1726882421.28530: starting attempt loop 12081 1726882421.28536: running the handler 12081 
1726882421.28574: variable 'lsr_setup' from source: include params 12081 1726882421.28639: variable 'lsr_setup' from source: include params 12081 1726882421.28701: handler run complete 12081 1726882421.28725: attempt loop complete, returning result 12081 1726882421.28740: variable 'item' from source: unknown 12081 1726882421.28808: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/create_test_interfaces_with_dhcp.yml",
        "tasks/assert_dhcp_device_present.yml"
    ]
}
12081 1726882421.28992: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.29007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.29021: variable 'omit' from source: magic vars 12081 1726882421.29198: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.29208: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.29216: variable 'omit' from source: magic vars 12081 1726882421.29235: variable 'omit' from source: magic vars 12081 1726882421.29292: variable 'item' from source: unknown 12081 1726882421.29365: variable 'item' from source: unknown 12081 1726882421.29386: variable 'omit' from source: magic vars 12081 1726882421.29408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.29420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.29432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.29447: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.29457: variable 'ansible_host' from source: host vars for 
'managed_node3' 12081 1726882421.29473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.29554: Set connection var ansible_pipelining to False 12081 1726882421.29565: Set connection var ansible_shell_type to sh 12081 1726882421.29585: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.29592: Set connection var ansible_connection to ssh 12081 1726882421.29602: Set connection var ansible_timeout to 10 12081 1726882421.29611: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.29635: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.29642: variable 'ansible_connection' from source: unknown 12081 1726882421.29650: variable 'ansible_module_compression' from source: unknown 12081 1726882421.29661: variable 'ansible_shell_type' from source: unknown 12081 1726882421.29671: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.29677: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.29691: variable 'ansible_pipelining' from source: unknown 12081 1726882421.29699: variable 'ansible_timeout' from source: unknown 12081 1726882421.29706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.29810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.29823: variable 'omit' from source: magic vars 12081 1726882421.29831: starting attempt loop 12081 1726882421.29838: running the handler 12081 1726882421.29866: variable 'lsr_test' from source: include params 12081 1726882421.29938: variable 'lsr_test' from source: include params 12081 1726882421.29967: handler run complete 12081 1726882421.29987: attempt loop 
complete, returning result 12081 1726882421.30006: variable 'item' from source: unknown 12081 1726882421.30079: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/create_bond_profile_reconfigure.yml"
    ]
}
12081 1726882421.30241: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.30257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.30273: variable 'omit' from source: magic vars 12081 1726882421.30441: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.30456: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.30468: variable 'omit' from source: magic vars 12081 1726882421.30487: variable 'omit' from source: magic vars 12081 1726882421.30539: variable 'item' from source: unknown 12081 1726882421.30605: variable 'item' from source: unknown 12081 1726882421.30632: variable 'omit' from source: magic vars 12081 1726882421.30657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.30674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.30686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.30702: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.30709: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.30716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.30803: Set connection var ansible_pipelining to False 12081 1726882421.30812: Set connection var ansible_shell_type to sh 12081 
1726882421.30824: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.30832: Set connection var ansible_connection to ssh 12081 1726882421.30848: Set connection var ansible_timeout to 10 12081 1726882421.30860: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.30887: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.30896: variable 'ansible_connection' from source: unknown 12081 1726882421.30903: variable 'ansible_module_compression' from source: unknown 12081 1726882421.30910: variable 'ansible_shell_type' from source: unknown 12081 1726882421.30916: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.30923: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.30930: variable 'ansible_pipelining' from source: unknown 12081 1726882421.30937: variable 'ansible_timeout' from source: unknown 12081 1726882421.30954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.31041: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.31062: variable 'omit' from source: magic vars 12081 1726882421.31072: starting attempt loop 12081 1726882421.31078: running the handler 12081 1726882421.31097: variable 'lsr_assert' from source: include params 12081 1726882421.31161: variable 'lsr_assert' from source: include params 12081 1726882421.31186: handler run complete 12081 1726882421.31201: attempt loop complete, returning result 12081 1726882421.31216: variable 'item' from source: unknown 12081 1726882421.31284: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ 
"tasks/assert_bond_options.yml" ] } 12081 1726882421.31429: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.31441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.31456: variable 'omit' from source: magic vars 12081 1726882421.31974: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.31986: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.31994: variable 'omit' from source: magic vars 12081 1726882421.32011: variable 'omit' from source: magic vars 12081 1726882421.32067: variable 'item' from source: unknown 12081 1726882421.32131: variable 'item' from source: unknown 12081 1726882421.32161: variable 'omit' from source: magic vars 12081 1726882421.32185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.32198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.32208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.32224: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.32232: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.32239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.32324: Set connection var ansible_pipelining to False 12081 1726882421.32332: Set connection var ansible_shell_type to sh 12081 1726882421.32344: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.32351: Set connection var ansible_connection to ssh 12081 1726882421.32372: Set connection var ansible_timeout to 10 12081 1726882421.32383: Set connection var ansible_module_compression to 
ZIP_DEFLATED 12081 1726882421.32408: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.32416: variable 'ansible_connection' from source: unknown 12081 1726882421.32423: variable 'ansible_module_compression' from source: unknown 12081 1726882421.32430: variable 'ansible_shell_type' from source: unknown 12081 1726882421.32437: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.32443: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.32451: variable 'ansible_pipelining' from source: unknown 12081 1726882421.32462: variable 'ansible_timeout' from source: unknown 12081 1726882421.32476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.32570: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.32588: variable 'omit' from source: magic vars 12081 1726882421.32600: starting attempt loop 12081 1726882421.32606: running the handler 12081 1726882421.32723: handler run complete 12081 1726882421.32740: attempt loop complete, returning result 12081 1726882421.32762: variable 'item' from source: unknown 12081 1726882421.32832: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 12081 1726882421.32990: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.33004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.33019: variable 'omit' from source: magic vars 12081 1726882421.33190: variable 'ansible_distribution_major_version' from source: facts 12081 
1726882421.33200: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.33208: variable 'omit' from source: magic vars 12081 1726882421.33225: variable 'omit' from source: magic vars 12081 1726882421.33277: variable 'item' from source: unknown 12081 1726882421.33339: variable 'item' from source: unknown 12081 1726882421.33368: variable 'omit' from source: magic vars 12081 1726882421.33391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.33402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.33411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.33424: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.33431: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.33437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.33521: Set connection var ansible_pipelining to False 12081 1726882421.33528: Set connection var ansible_shell_type to sh 12081 1726882421.33540: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.33546: Set connection var ansible_connection to ssh 12081 1726882421.33560: Set connection var ansible_timeout to 10 12081 1726882421.33572: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.33602: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.33610: variable 'ansible_connection' from source: unknown 12081 1726882421.33617: variable 'ansible_module_compression' from source: unknown 12081 1726882421.33624: variable 'ansible_shell_type' from source: unknown 12081 1726882421.33631: variable 'ansible_shell_executable' 
from source: unknown 12081 1726882421.33638: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.33647: variable 'ansible_pipelining' from source: unknown 12081 1726882421.33656: variable 'ansible_timeout' from source: unknown 12081 1726882421.33666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.33747: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.33760: variable 'omit' from source: magic vars 12081 1726882421.33769: starting attempt loop 12081 1726882421.33774: running the handler 12081 1726882421.33798: variable 'lsr_fail_debug' from source: play vars 12081 1726882421.33857: variable 'lsr_fail_debug' from source: play vars 12081 1726882421.33880: handler run complete 12081 1726882421.33897: attempt loop complete, returning result 12081 1726882421.33919: variable 'item' from source: unknown 12081 1726882421.33981: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 12081 1726882421.34119: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.34134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.34148: variable 'omit' from source: magic vars 12081 1726882421.34323: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.34334: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.34342: variable 'omit' from source: magic vars 12081 1726882421.34365: variable 'omit' from source: magic vars 12081 1726882421.34416: variable 'item' from source: 
unknown 12081 1726882421.34485: variable 'item' from source: unknown 12081 1726882421.34511: variable 'omit' from source: magic vars 12081 1726882421.34533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.34545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.34566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.34582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.34589: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.34596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.34683: Set connection var ansible_pipelining to False 12081 1726882421.34691: Set connection var ansible_shell_type to sh 12081 1726882421.34703: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.34710: Set connection var ansible_connection to ssh 12081 1726882421.34725: Set connection var ansible_timeout to 10 12081 1726882421.34736: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.34764: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.34773: variable 'ansible_connection' from source: unknown 12081 1726882421.34780: variable 'ansible_module_compression' from source: unknown 12081 1726882421.34787: variable 'ansible_shell_type' from source: unknown 12081 1726882421.34794: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.34801: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.34808: variable 'ansible_pipelining' from source: unknown 12081 1726882421.34815: variable 'ansible_timeout' from source: unknown 12081 
1726882421.34822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.34920: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.34933: variable 'omit' from source: magic vars 12081 1726882421.34949: starting attempt loop 12081 1726882421.34959: running the handler 12081 1726882421.34984: variable 'lsr_cleanup' from source: include params 12081 1726882421.35058: variable 'lsr_cleanup' from source: include params 12081 1726882421.35085: handler run complete 12081 1726882421.35103: attempt loop complete, returning result 12081 1726882421.35122: variable 'item' from source: unknown 12081 1726882421.35195: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 12081 1726882421.35306: dumping result to json 12081 1726882421.35320: done dumping result, returning 12081 1726882421.35332: done running TaskExecutor() for managed_node3/TASK: Show item [0e448fcc-3ce9-0a3f-ff3c-0000000008eb] 12081 1726882421.35344: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008eb 12081 1726882421.35484: no more pending results, returning what we have 12081 1726882421.35488: results queue empty 12081 1726882421.35489: checking for any_errors_fatal 12081 1726882421.35497: done checking for any_errors_fatal 12081 1726882421.35498: checking for max_fail_percentage 12081 1726882421.35500: done checking for max_fail_percentage 12081 1726882421.35501: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.35502: done checking to see 
if all hosts have failed 12081 1726882421.35503: getting the remaining hosts for this loop 12081 1726882421.35505: done getting the remaining hosts for this loop 12081 1726882421.35509: getting the next task for host managed_node3 12081 1726882421.35516: done getting next task for host managed_node3 12081 1726882421.35519: ^ task is: TASK: Include the task 'show_interfaces.yml' 12081 1726882421.35523: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882421.35527: getting variables 12081 1726882421.35529: in VariableManager get_vars() 12081 1726882421.35573: Calling all_inventory to load vars for managed_node3 12081 1726882421.35576: Calling groups_inventory to load vars for managed_node3 12081 1726882421.35578: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.35591: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.35593: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.35597: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.36604: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008eb 12081 1726882421.36607: WORKER PROCESS EXITING 12081 1726882421.37543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.39601: done with get_vars() 12081 1726882421.39637: done getting variables TASK [Include the task 'show_interfaces.yml'] 
********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:33:41 -0400 (0:00:00.149) 0:00:41.200 ****** 12081 1726882421.39739: entering _queue_task() for managed_node3/include_tasks 12081 1726882421.40081: worker is 1 (out of 1 available) 12081 1726882421.40094: exiting _queue_task() for managed_node3/include_tasks 12081 1726882421.40108: done queuing things up, now waiting for results queue to drain 12081 1726882421.40109: waiting for pending results... 12081 1726882421.41081: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 12081 1726882421.41175: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008ec 12081 1726882421.41190: variable 'ansible_search_path' from source: unknown 12081 1726882421.41196: variable 'ansible_search_path' from source: unknown 12081 1726882421.41232: calling self._execute() 12081 1726882421.41428: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.41432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.41446: variable 'omit' from source: magic vars 12081 1726882421.42312: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.42325: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.42331: _execute() done 12081 1726882421.42334: dumping result to json 12081 1726882421.42337: done dumping result, returning 12081 1726882421.42342: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-0a3f-ff3c-0000000008ec] 12081 1726882421.42351: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ec 12081 1726882421.42655: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ec 12081 1726882421.42658: WORKER PROCESS EXITING 12081 1726882421.42689: no more pending results, 
returning what we have 12081 1726882421.42694: in VariableManager get_vars() 12081 1726882421.42739: Calling all_inventory to load vars for managed_node3 12081 1726882421.42742: Calling groups_inventory to load vars for managed_node3 12081 1726882421.42744: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.42760: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.42765: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.42768: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.44599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.46483: done with get_vars() 12081 1726882421.46507: variable 'ansible_search_path' from source: unknown 12081 1726882421.46508: variable 'ansible_search_path' from source: unknown 12081 1726882421.46550: we have included files to process 12081 1726882421.46551: generating all_blocks data 12081 1726882421.46558: done generating all_blocks data 12081 1726882421.46563: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882421.46566: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882421.46569: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12081 1726882421.46688: in VariableManager get_vars() 12081 1726882421.46712: done with get_vars() 12081 1726882421.46836: done processing included file 12081 1726882421.46838: iterating over new_blocks loaded from include file 12081 1726882421.46840: in VariableManager get_vars() 12081 1726882421.46859: done with get_vars() 12081 1726882421.46860: filtering new block on tags 12081 1726882421.46908: done filtering new block on tags 12081 
1726882421.46911: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 12081 1726882421.46916: extending task lists for all hosts with included blocks 12081 1726882421.47456: done extending task lists 12081 1726882421.47458: done processing included files 12081 1726882421.47459: results queue empty 12081 1726882421.47459: checking for any_errors_fatal 12081 1726882421.47468: done checking for any_errors_fatal 12081 1726882421.47469: checking for max_fail_percentage 12081 1726882421.47470: done checking for max_fail_percentage 12081 1726882421.47471: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.47472: done checking to see if all hosts have failed 12081 1726882421.47472: getting the remaining hosts for this loop 12081 1726882421.47474: done getting the remaining hosts for this loop 12081 1726882421.47476: getting the next task for host managed_node3 12081 1726882421.47480: done getting next task for host managed_node3 12081 1726882421.47482: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 12081 1726882421.47485: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882421.47488: getting variables 12081 1726882421.47489: in VariableManager get_vars() 12081 1726882421.47500: Calling all_inventory to load vars for managed_node3 12081 1726882421.47503: Calling groups_inventory to load vars for managed_node3 12081 1726882421.47505: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.47511: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.47513: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.47516: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.49101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.50751: done with get_vars() 12081 1726882421.50784: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:33:41 -0400 (0:00:00.111) 0:00:41.311 ****** 12081 1726882421.50871: entering _queue_task() for managed_node3/include_tasks 12081 1726882421.51219: worker is 1 (out of 1 available) 12081 1726882421.51231: exiting _queue_task() for managed_node3/include_tasks 12081 1726882421.51246: done queuing things up, now waiting for results queue to drain 12081 1726882421.51248: waiting for pending results... 
12081 1726882421.51547: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 12081 1726882421.51655: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000913 12081 1726882421.51669: variable 'ansible_search_path' from source: unknown 12081 1726882421.51676: variable 'ansible_search_path' from source: unknown 12081 1726882421.51713: calling self._execute() 12081 1726882421.51812: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.51817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.51826: variable 'omit' from source: magic vars 12081 1726882421.52173: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.52185: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.52192: _execute() done 12081 1726882421.52195: dumping result to json 12081 1726882421.52197: done dumping result, returning 12081 1726882421.52205: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000913] 12081 1726882421.52212: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000913 12081 1726882421.52310: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000913 12081 1726882421.52313: WORKER PROCESS EXITING 12081 1726882421.52357: no more pending results, returning what we have 12081 1726882421.52362: in VariableManager get_vars() 12081 1726882421.52411: Calling all_inventory to load vars for managed_node3 12081 1726882421.52414: Calling groups_inventory to load vars for managed_node3 12081 1726882421.52417: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.52432: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.52436: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.52439: Calling groups_plugins_play to load vars for managed_node3 12081 
1726882421.54157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.55815: done with get_vars() 12081 1726882421.55842: variable 'ansible_search_path' from source: unknown 12081 1726882421.55844: variable 'ansible_search_path' from source: unknown 12081 1726882421.55886: we have included files to process 12081 1726882421.55887: generating all_blocks data 12081 1726882421.55888: done generating all_blocks data 12081 1726882421.55890: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882421.55891: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882421.55893: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12081 1726882421.56167: done processing included file 12081 1726882421.56169: iterating over new_blocks loaded from include file 12081 1726882421.56170: in VariableManager get_vars() 12081 1726882421.56191: done with get_vars() 12081 1726882421.56193: filtering new block on tags 12081 1726882421.56231: done filtering new block on tags 12081 1726882421.56233: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 12081 1726882421.56238: extending task lists for all hosts with included blocks 12081 1726882421.56401: done extending task lists 12081 1726882421.56402: done processing included files 12081 1726882421.56403: results queue empty 12081 1726882421.56404: checking for any_errors_fatal 12081 1726882421.56407: done checking for any_errors_fatal 12081 1726882421.56408: checking for max_fail_percentage 12081 1726882421.56409: done 
checking for max_fail_percentage 12081 1726882421.56410: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.56411: done checking to see if all hosts have failed 12081 1726882421.56412: getting the remaining hosts for this loop 12081 1726882421.56413: done getting the remaining hosts for this loop 12081 1726882421.56416: getting the next task for host managed_node3 12081 1726882421.56420: done getting next task for host managed_node3 12081 1726882421.56422: ^ task is: TASK: Gather current interface info 12081 1726882421.56426: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882421.56428: getting variables 12081 1726882421.56429: in VariableManager get_vars() 12081 1726882421.56441: Calling all_inventory to load vars for managed_node3 12081 1726882421.56443: Calling groups_inventory to load vars for managed_node3 12081 1726882421.56445: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.56450: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.56452: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.56455: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.57782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.59489: done with get_vars() 12081 1726882421.59516: done getting variables 12081 1726882421.59569: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:33:41 -0400 (0:00:00.087) 0:00:41.398 ****** 12081 1726882421.59604: entering _queue_task() for managed_node3/command 12081 1726882421.59973: worker is 1 (out of 1 available) 12081 1726882421.59988: exiting _queue_task() for managed_node3/command 12081 1726882421.60003: done queuing things up, now waiting for results queue to drain 12081 1726882421.60004: waiting for pending results... 
12081 1726882421.60316: running TaskExecutor() for managed_node3/TASK: Gather current interface info 12081 1726882421.60420: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000094e 12081 1726882421.60437: variable 'ansible_search_path' from source: unknown 12081 1726882421.60440: variable 'ansible_search_path' from source: unknown 12081 1726882421.60484: calling self._execute() 12081 1726882421.60589: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.60592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.60603: variable 'omit' from source: magic vars 12081 1726882421.60958: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.60970: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.60976: variable 'omit' from source: magic vars 12081 1726882421.61025: variable 'omit' from source: magic vars 12081 1726882421.61058: variable 'omit' from source: magic vars 12081 1726882421.61103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882421.61136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.61158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882421.61176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.61187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.61222: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.61225: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.61228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882421.61330: Set connection var ansible_pipelining to False 12081 1726882421.61333: Set connection var ansible_shell_type to sh 12081 1726882421.61339: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.61342: Set connection var ansible_connection to ssh 12081 1726882421.61347: Set connection var ansible_timeout to 10 12081 1726882421.61354: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.61380: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.61383: variable 'ansible_connection' from source: unknown 12081 1726882421.61387: variable 'ansible_module_compression' from source: unknown 12081 1726882421.61389: variable 'ansible_shell_type' from source: unknown 12081 1726882421.61392: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.61394: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.61396: variable 'ansible_pipelining' from source: unknown 12081 1726882421.61398: variable 'ansible_timeout' from source: unknown 12081 1726882421.61400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.61542: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.61556: variable 'omit' from source: magic vars 12081 1726882421.61559: starting attempt loop 12081 1726882421.61562: running the handler 12081 1726882421.61581: _low_level_execute_command(): starting 12081 1726882421.61589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882421.62341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882421.62357: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882421.62372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.62383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.62427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882421.62434: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882421.62444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.62458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882421.62467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882421.62476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882421.62484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.62493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.62506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.62515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882421.62524: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882421.62537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.62605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.62623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882421.62631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882421.62778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12081 1726882421.64485: stdout chunk (state=3): >>>/root <<< 12081 1726882421.64585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882421.64689: stderr chunk (state=3): >>><<< 12081 1726882421.64692: stdout chunk (state=3): >>><<< 12081 1726882421.64718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882421.64733: _low_level_execute_command(): starting 12081 1726882421.64740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671 `" && echo ansible-tmp-1726882421.6471815-14010-130111486342671="` echo /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671 `" ) && sleep 0' 12081 1726882421.65366: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882421.65376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.65385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.65398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.65435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882421.65442: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882421.65451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.65466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882421.65473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882421.65480: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882421.65486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.65495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.65505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.65511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882421.65517: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882421.65529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.65595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.65609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882421.65615: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882421.65748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882421.67665: stdout chunk (state=3): >>>ansible-tmp-1726882421.6471815-14010-130111486342671=/root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671 <<< 12081 1726882421.67783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882421.67835: stderr chunk (state=3): >>><<< 12081 1726882421.67838: stdout chunk (state=3): >>><<< 12081 1726882421.67859: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882421.6471815-14010-130111486342671=/root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882421.67896: variable 'ansible_module_compression' from source: unknown 12081 1726882421.67973: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882421.68026: variable 'ansible_facts' from source: unknown 12081 1726882421.68079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/AnsiballZ_command.py 12081 1726882421.68221: Sending initial data 12081 1726882421.68224: Sent initial data (156 bytes) 12081 1726882421.69073: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882421.69082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.69092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.69105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.69142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882421.69149: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882421.69159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.69177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882421.69182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882421.69190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882421.69197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.69206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.69219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.69224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 12081 1726882421.69231: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882421.69240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.69313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.69331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882421.69343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882421.69473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882421.71201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882421.71301: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882421.71393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp7y1kidpz /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/AnsiballZ_command.py <<< 12081 1726882421.71492: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882421.72512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882421.72763: stderr chunk (state=3): >>><<< 12081 1726882421.72767: stdout chunk (state=3): >>><<< 12081 
1726882421.72769: done transferring module to remote 12081 1726882421.72775: _low_level_execute_command(): starting 12081 1726882421.72777: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/ /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/AnsiballZ_command.py && sleep 0' 12081 1726882421.73232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.73238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.73289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.73293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.73295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.73343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.73349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882421.73359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882421.73481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882421.75200: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882421.75251: stderr chunk (state=3): >>><<< 12081 1726882421.75257: stdout chunk (state=3): >>><<< 12081 1726882421.75276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882421.75279: _low_level_execute_command(): starting 12081 1726882421.75283: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/AnsiballZ_command.py && sleep 0' 12081 1726882421.75734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882421.75738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882421.75786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.75789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.75792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.75838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.75847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882421.75855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882421.75982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882421.89333: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:33:41.888911", "end": "2024-09-20 21:33:41.891899", "delta": "0:00:00.002988", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882421.90405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882421.90470: stderr chunk (state=3): >>><<< 12081 1726882421.90474: stdout chunk (state=3): >>><<< 12081 1726882421.90492: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:33:41.888911", "end": "2024-09-20 21:33:41.891899", "delta": "0:00:00.002988", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882421.90522: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882421.90531: _low_level_execute_command(): starting 12081 1726882421.90533: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882421.6471815-14010-130111486342671/ > /dev/null 2>&1 && sleep 0' 12081 1726882421.91011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882421.91023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882421.91040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882421.91053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882421.91066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882421.91113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882421.91125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882421.91230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882421.93011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882421.93073: stderr chunk (state=3): >>><<< 12081 1726882421.93078: stdout chunk (state=3): >>><<< 12081 1726882421.93104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 12081 1726882421.93110: handler run complete 12081 1726882421.93129: Evaluated conditional (False): False 12081 1726882421.93139: attempt loop complete, returning result 12081 1726882421.93142: _execute() done 12081 1726882421.93144: dumping result to json 12081 1726882421.93146: done dumping result, returning 12081 1726882421.93157: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-0a3f-ff3c-00000000094e] 12081 1726882421.93164: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000094e 12081 1726882421.93262: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000094e 12081 1726882421.93267: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002988", "end": "2024-09-20 21:33:41.891899", "rc": 0, "start": "2024-09-20 21:33:41.888911" } STDOUT: bonding_masters eth0 lo 12081 1726882421.93339: no more pending results, returning what we have 12081 1726882421.93344: results queue empty 12081 1726882421.93345: checking for any_errors_fatal 12081 1726882421.93346: done checking for any_errors_fatal 12081 1726882421.93346: checking for max_fail_percentage 12081 1726882421.93348: done checking for max_fail_percentage 12081 1726882421.93349: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.93350: done checking to see if all hosts have failed 12081 1726882421.93351: getting the remaining hosts for this loop 12081 1726882421.93355: done getting the remaining hosts for this loop 12081 1726882421.93359: getting the next task for host managed_node3 12081 1726882421.93369: done getting next task for host managed_node3 12081 1726882421.93371: ^ task is: TASK: Set current_interfaces 12081 1726882421.93380: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882421.93385: getting variables 12081 1726882421.93387: in VariableManager get_vars() 12081 1726882421.93439: Calling all_inventory to load vars for managed_node3 12081 1726882421.93442: Calling groups_inventory to load vars for managed_node3 12081 1726882421.93444: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.93459: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.93462: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.93468: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.94789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882421.95910: done with get_vars() 12081 1726882421.95930: done getting variables 12081 1726882421.95977: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:33:41 -0400 (0:00:00.363) 0:00:41.762 ****** 12081 1726882421.96003: entering _queue_task() for managed_node3/set_fact 12081 1726882421.96237: worker is 1 (out of 1 available) 12081 1726882421.96252: exiting _queue_task() for managed_node3/set_fact 12081 1726882421.96269: done queuing things up, now waiting for results queue to drain 12081 1726882421.96271: waiting for pending results... 12081 1726882421.96461: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 12081 1726882421.96542: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000094f 12081 1726882421.96557: variable 'ansible_search_path' from source: unknown 12081 1726882421.96561: variable 'ansible_search_path' from source: unknown 12081 1726882421.96607: calling self._execute() 12081 1726882421.96704: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.96707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.96717: variable 'omit' from source: magic vars 12081 1726882421.97101: variable 'ansible_distribution_major_version' from source: facts 12081 1726882421.97155: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882421.97158: variable 'omit' from source: magic vars 12081 1726882421.97198: variable 'omit' from source: magic vars 12081 1726882421.97279: variable '_current_interfaces' from source: set_fact 12081 1726882421.97332: variable 'omit' from source: magic vars 12081 1726882421.97370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 
1726882421.97395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882421.97413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882421.97426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.97436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882421.97461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882421.97466: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.97470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882421.97540: Set connection var ansible_pipelining to False 12081 1726882421.97543: Set connection var ansible_shell_type to sh 12081 1726882421.97548: Set connection var ansible_shell_executable to /bin/sh 12081 1726882421.97551: Set connection var ansible_connection to ssh 12081 1726882421.97555: Set connection var ansible_timeout to 10 12081 1726882421.97565: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882421.97584: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.97588: variable 'ansible_connection' from source: unknown 12081 1726882421.97590: variable 'ansible_module_compression' from source: unknown 12081 1726882421.97592: variable 'ansible_shell_type' from source: unknown 12081 1726882421.97594: variable 'ansible_shell_executable' from source: unknown 12081 1726882421.97596: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882421.97600: variable 'ansible_pipelining' from source: unknown 12081 1726882421.97602: variable 'ansible_timeout' from source: unknown 12081 1726882421.97607: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 12081 1726882421.97708: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882421.97717: variable 'omit' from source: magic vars 12081 1726882421.97722: starting attempt loop 12081 1726882421.97726: running the handler 12081 1726882421.97736: handler run complete 12081 1726882421.97744: attempt loop complete, returning result 12081 1726882421.97747: _execute() done 12081 1726882421.97749: dumping result to json 12081 1726882421.97751: done dumping result, returning 12081 1726882421.97760: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-0a3f-ff3c-00000000094f] 12081 1726882421.97766: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000094f 12081 1726882421.97865: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000094f 12081 1726882421.97869: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 12081 1726882421.98011: no more pending results, returning what we have 12081 1726882421.98015: results queue empty 12081 1726882421.98015: checking for any_errors_fatal 12081 1726882421.98027: done checking for any_errors_fatal 12081 1726882421.98028: checking for max_fail_percentage 12081 1726882421.98030: done checking for max_fail_percentage 12081 1726882421.98031: checking to see if all hosts have failed and the running result is not ok 12081 1726882421.98032: done checking to see if all hosts have failed 12081 1726882421.98033: getting the remaining hosts for this loop 12081 1726882421.98035: done getting the remaining hosts for this loop 12081 1726882421.98039: getting the next task for host managed_node3 
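For readers tracing this log against the test playbooks: the `Set current_interfaces` task at tasks/get_current_interfaces.yml:9 is a `set_fact` action that promotes the previously gathered `_current_interfaces` value (the log shows `variable '_current_interfaces' from source: set_fact` just before the handler runs). A minimal sketch of what that task plausibly looks like — only the task name, the `set_fact` action, and the `_current_interfaces` → `current_interfaces` handoff are confirmed by the log; the exact expression is an assumption:

```yaml
# Hypothetical reconstruction of tasks/get_current_interfaces.yml:9;
# the log confirms the action plugin and the variable names only.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces }}"
```

The resulting fact matches the `ok:` result logged above: `current_interfaces: ['bonding_masters', 'eth0', 'lo']`.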
12081 1726882421.98047: done getting next task for host managed_node3 12081 1726882421.98049: ^ task is: TASK: Show current_interfaces 12081 1726882421.98060: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882421.98066: getting variables 12081 1726882421.98068: in VariableManager get_vars() 12081 1726882421.98106: Calling all_inventory to load vars for managed_node3 12081 1726882421.98109: Calling groups_inventory to load vars for managed_node3 12081 1726882421.98111: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882421.98123: Calling all_plugins_play to load vars for managed_node3 12081 1726882421.98126: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882421.98129: Calling groups_plugins_play to load vars for managed_node3 12081 1726882421.99930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882422.00897: done with get_vars() 12081 1726882422.00916: done getting variables 12081 1726882422.00960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:33:42 -0400 (0:00:00.049) 0:00:41.812 ****** 12081 1726882422.00986: entering _queue_task() for managed_node3/debug 12081 1726882422.01229: worker is 1 (out of 1 available) 12081 1726882422.01243: exiting _queue_task() for managed_node3/debug 12081 1726882422.01256: done queuing things up, now waiting for results queue to drain 12081 1726882422.01257: waiting for pending results... 
12081 1726882422.01449: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 12081 1726882422.01525: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000914 12081 1726882422.01540: variable 'ansible_search_path' from source: unknown 12081 1726882422.01544: variable 'ansible_search_path' from source: unknown 12081 1726882422.01575: calling self._execute() 12081 1726882422.01654: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.01662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.01672: variable 'omit' from source: magic vars 12081 1726882422.01996: variable 'ansible_distribution_major_version' from source: facts 12081 1726882422.02016: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882422.02026: variable 'omit' from source: magic vars 12081 1726882422.02078: variable 'omit' from source: magic vars 12081 1726882422.02176: variable 'current_interfaces' from source: set_fact 12081 1726882422.02208: variable 'omit' from source: magic vars 12081 1726882422.02250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882422.02287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882422.02314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882422.02334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882422.02348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882422.02383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882422.02390: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.02396: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.02494: Set connection var ansible_pipelining to False 12081 1726882422.02502: Set connection var ansible_shell_type to sh 12081 1726882422.02517: Set connection var ansible_shell_executable to /bin/sh 12081 1726882422.02524: Set connection var ansible_connection to ssh 12081 1726882422.02533: Set connection var ansible_timeout to 10 12081 1726882422.02541: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882422.02571: variable 'ansible_shell_executable' from source: unknown 12081 1726882422.02580: variable 'ansible_connection' from source: unknown 12081 1726882422.02588: variable 'ansible_module_compression' from source: unknown 12081 1726882422.02594: variable 'ansible_shell_type' from source: unknown 12081 1726882422.02600: variable 'ansible_shell_executable' from source: unknown 12081 1726882422.02606: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.02613: variable 'ansible_pipelining' from source: unknown 12081 1726882422.02621: variable 'ansible_timeout' from source: unknown 12081 1726882422.02630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.02778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882422.02795: variable 'omit' from source: magic vars 12081 1726882422.02804: starting attempt loop 12081 1726882422.02810: running the handler 12081 1726882422.02868: handler run complete 12081 1726882422.02887: attempt loop complete, returning result 12081 1726882422.02893: _execute() done 12081 1726882422.02899: dumping result to json 12081 1726882422.02905: done dumping result, returning 12081 1726882422.02915: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000914] 12081 1726882422.02925: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000914 12081 1726882422.03027: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000914 12081 1726882422.03033: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 12081 1726882422.03107: no more pending results, returning what we have 12081 1726882422.03111: results queue empty 12081 1726882422.03112: checking for any_errors_fatal 12081 1726882422.03119: done checking for any_errors_fatal 12081 1726882422.03120: checking for max_fail_percentage 12081 1726882422.03122: done checking for max_fail_percentage 12081 1726882422.03123: checking to see if all hosts have failed and the running result is not ok 12081 1726882422.03124: done checking to see if all hosts have failed 12081 1726882422.03125: getting the remaining hosts for this loop 12081 1726882422.03127: done getting the remaining hosts for this loop 12081 1726882422.03131: getting the next task for host managed_node3 12081 1726882422.03139: done getting next task for host managed_node3 12081 1726882422.03143: ^ task is: TASK: Setup 12081 1726882422.03146: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882422.03152: getting variables 12081 1726882422.03154: in VariableManager get_vars() 12081 1726882422.03197: Calling all_inventory to load vars for managed_node3 12081 1726882422.03200: Calling groups_inventory to load vars for managed_node3 12081 1726882422.03203: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882422.03216: Calling all_plugins_play to load vars for managed_node3 12081 1726882422.03220: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882422.03223: Calling groups_plugins_play to load vars for managed_node3 12081 1726882422.04987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882422.06606: done with get_vars() 12081 1726882422.06642: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:33:42 -0400 (0:00:00.057) 0:00:41.870 ****** 12081 1726882422.06738: entering _queue_task() for managed_node3/include_tasks 12081 1726882422.07076: worker is 1 (out of 1 available) 12081 1726882422.07089: exiting _queue_task() for managed_node3/include_tasks 12081 1726882422.07103: done queuing things up, now waiting for results queue to drain 12081 1726882422.07104: waiting for pending results... 
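The `Show current_interfaces` task at tasks/show_interfaces.yml:5, whose result (`MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`) appears above, is a `debug` action. A plausible sketch, with the `msg` template inferred from the logged output format (the actual task body is not shown in this log):

```yaml
# Sketch inferred from the logged MSG format; the real task body is an
# assumption beyond the debug action and the current_interfaces variable.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```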
12081 1726882422.07389: running TaskExecutor() for managed_node3/TASK: Setup 12081 1726882422.07499: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008ed 12081 1726882422.07517: variable 'ansible_search_path' from source: unknown 12081 1726882422.07522: variable 'ansible_search_path' from source: unknown 12081 1726882422.07576: variable 'lsr_setup' from source: include params 12081 1726882422.07782: variable 'lsr_setup' from source: include params 12081 1726882422.07858: variable 'omit' from source: magic vars 12081 1726882422.07998: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.08010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.08021: variable 'omit' from source: magic vars 12081 1726882422.08252: variable 'ansible_distribution_major_version' from source: facts 12081 1726882422.08269: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882422.08278: variable 'item' from source: unknown 12081 1726882422.08343: variable 'item' from source: unknown 12081 1726882422.08378: variable 'item' from source: unknown 12081 1726882422.08440: variable 'item' from source: unknown 12081 1726882422.08635: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.08649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.08667: variable 'omit' from source: magic vars 12081 1726882422.08831: variable 'ansible_distribution_major_version' from source: facts 12081 1726882422.08842: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882422.08851: variable 'item' from source: unknown 12081 1726882422.08919: variable 'item' from source: unknown 12081 1726882422.08952: variable 'item' from source: unknown 12081 1726882422.09009: variable 'item' from source: unknown 12081 1726882422.09089: dumping result to json 12081 1726882422.09098: done dumping result, returning 12081 
1726882422.09105: done running TaskExecutor() for managed_node3/TASK: Setup [0e448fcc-3ce9-0a3f-ff3c-0000000008ed] 12081 1726882422.09116: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ed 12081 1726882422.09185: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ed 12081 1726882422.09193: WORKER PROCESS EXITING 12081 1726882422.09256: no more pending results, returning what we have 12081 1726882422.09261: in VariableManager get_vars() 12081 1726882422.09314: Calling all_inventory to load vars for managed_node3 12081 1726882422.09318: Calling groups_inventory to load vars for managed_node3 12081 1726882422.09320: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882422.09338: Calling all_plugins_play to load vars for managed_node3 12081 1726882422.09342: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882422.09345: Calling groups_plugins_play to load vars for managed_node3 12081 1726882422.11188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882422.12716: done with get_vars() 12081 1726882422.12738: variable 'ansible_search_path' from source: unknown 12081 1726882422.12739: variable 'ansible_search_path' from source: unknown 12081 1726882422.12775: variable 'ansible_search_path' from source: unknown 12081 1726882422.12777: variable 'ansible_search_path' from source: unknown 12081 1726882422.12801: we have included files to process 12081 1726882422.12802: generating all_blocks data 12081 1726882422.12804: done generating all_blocks data 12081 1726882422.12807: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882422.12808: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882422.12809: 
Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12081 1726882422.13377: done processing included file 12081 1726882422.13379: iterating over new_blocks loaded from include file 12081 1726882422.13380: in VariableManager get_vars() 12081 1726882422.13392: done with get_vars() 12081 1726882422.13393: filtering new block on tags 12081 1726882422.13425: done filtering new block on tags 12081 1726882422.13427: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 12081 1726882422.13430: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882422.13431: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882422.13433: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12081 1726882422.13492: in VariableManager get_vars() 12081 1726882422.13506: done with get_vars() 12081 1726882422.13510: variable 'item' from source: include params 12081 1726882422.13584: variable 'item' from source: include params 12081 1726882422.13605: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12081 1726882422.13658: in VariableManager get_vars() 12081 1726882422.13675: done with get_vars() 12081 1726882422.13759: in VariableManager 
get_vars() 12081 1726882422.13774: done with get_vars() 12081 1726882422.13777: variable 'item' from source: include params 12081 1726882422.13819: variable 'item' from source: include params 12081 1726882422.13838: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12081 1726882422.13941: in VariableManager get_vars() 12081 1726882422.13957: done with get_vars() 12081 1726882422.14021: done processing included file 12081 1726882422.14022: iterating over new_blocks loaded from include file 12081 1726882422.14023: in VariableManager get_vars() 12081 1726882422.14033: done with get_vars() 12081 1726882422.14034: filtering new block on tags 12081 1726882422.14083: done filtering new block on tags 12081 1726882422.14085: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 12081 1726882422.14088: extending task lists for all hosts with included blocks 12081 1726882422.14532: done extending task lists 12081 1726882422.14533: done processing included files 12081 1726882422.14534: results queue empty 12081 1726882422.14535: checking for any_errors_fatal 12081 1726882422.14539: done checking for any_errors_fatal 12081 1726882422.14540: checking for max_fail_percentage 12081 1726882422.14541: done checking for max_fail_percentage 12081 1726882422.14542: checking to see if all hosts have failed and the running result is not ok 12081 1726882422.14543: done checking to see if all hosts have failed 12081 1726882422.14549: getting the remaining hosts for this loop 12081 1726882422.14550: done getting the remaining hosts for this loop 12081 
1726882422.14553: getting the next task for host managed_node3 12081 1726882422.14557: done getting next task for host managed_node3 12081 1726882422.14559: ^ task is: TASK: Install dnsmasq 12081 1726882422.14561: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882422.14565: getting variables 12081 1726882422.14567: in VariableManager get_vars() 12081 1726882422.14577: Calling all_inventory to load vars for managed_node3 12081 1726882422.14579: Calling groups_inventory to load vars for managed_node3 12081 1726882422.14581: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882422.14587: Calling all_plugins_play to load vars for managed_node3 12081 1726882422.14589: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882422.14592: Calling groups_plugins_play to load vars for managed_node3 12081 1726882422.19456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882422.20538: done with get_vars() 12081 1726882422.20561: done getting variables 12081 1726882422.20595: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:42 -0400 (0:00:00.138) 0:00:42.009 ****** 12081 1726882422.20615: entering _queue_task() for managed_node3/package 12081 1726882422.20856: worker is 1 (out of 1 available) 12081 1726882422.20871: exiting _queue_task() for managed_node3/package 12081 1726882422.20884: done queuing things up, now waiting for results queue to drain 12081 1726882422.20886: waiting for pending results... 
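The `Setup` task at tasks/run_test.yml:24 is an `include_tasks` that iterates over `lsr_setup` (resolved twice from include params above, yielding tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml as the included files). A hedged sketch of its shape — whether it uses `loop` or the older `with_items` is an assumption, as is everything except the include action and the `lsr_setup` variable:

```yaml
# Hypothetical shape of tasks/run_test.yml:24; the log confirms only
# that lsr_setup comes from include params and that each item becomes
# an included tasks file.
- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```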
12081 1726882422.21075: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 12081 1726882422.21154: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000974 12081 1726882422.21171: variable 'ansible_search_path' from source: unknown 12081 1726882422.21178: variable 'ansible_search_path' from source: unknown 12081 1726882422.21205: calling self._execute() 12081 1726882422.21300: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.21304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.21312: variable 'omit' from source: magic vars 12081 1726882422.21596: variable 'ansible_distribution_major_version' from source: facts 12081 1726882422.21609: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882422.21612: variable 'omit' from source: magic vars 12081 1726882422.21679: variable 'omit' from source: magic vars 12081 1726882422.21872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882422.23936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882422.24017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882422.24067: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882422.24102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882422.24144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882422.24239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882422.24269: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882422.24355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882422.24359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882422.24455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882422.24485: variable '__network_is_ostree' from source: set_fact 12081 1726882422.24488: variable 'omit' from source: magic vars 12081 1726882422.24523: variable 'omit' from source: magic vars 12081 1726882422.24570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882422.24603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882422.24625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882422.24642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882422.24653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882422.24698: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882422.24701: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.24704: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882422.24838: Set connection var ansible_pipelining to False 12081 1726882422.24841: Set connection var ansible_shell_type to sh 12081 1726882422.24845: Set connection var ansible_shell_executable to /bin/sh 12081 1726882422.24848: Set connection var ansible_connection to ssh 12081 1726882422.24850: Set connection var ansible_timeout to 10 12081 1726882422.24852: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882422.24883: variable 'ansible_shell_executable' from source: unknown 12081 1726882422.24887: variable 'ansible_connection' from source: unknown 12081 1726882422.24890: variable 'ansible_module_compression' from source: unknown 12081 1726882422.24901: variable 'ansible_shell_type' from source: unknown 12081 1726882422.24907: variable 'ansible_shell_executable' from source: unknown 12081 1726882422.24911: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882422.24917: variable 'ansible_pipelining' from source: unknown 12081 1726882422.24930: variable 'ansible_timeout' from source: unknown 12081 1726882422.24935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882422.25042: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882422.25052: variable 'omit' from source: magic vars 12081 1726882422.25072: starting attempt loop 12081 1726882422.25082: running the handler 12081 1726882422.25085: variable 'ansible_facts' from source: unknown 12081 1726882422.25087: variable 'ansible_facts' from source: unknown 12081 1726882422.25119: _low_level_execute_command(): starting 12081 1726882422.25126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882422.25883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882422.25915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.25923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.25926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.25962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.25975: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882422.25983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.25997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882422.26004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882422.26023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882422.26026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.26028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.26045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.26048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.26051: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882422.26056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.26130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882422.26163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882422.26167: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882422.26380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882422.27983: stdout chunk (state=3): >>>/root <<< 12081 1726882422.28086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882422.28141: stderr chunk (state=3): >>><<< 12081 1726882422.28147: stdout chunk (state=3): >>><<< 12081 1726882422.28174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882422.28187: _low_level_execute_command(): starting 12081 1726882422.28194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127 `" && echo 
ansible-tmp-1726882422.2817457-14052-213669476025127="` echo /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127 `" ) && sleep 0' 12081 1726882422.28975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882422.28986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.29000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.29014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.29050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.29057: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882422.29070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.29084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882422.29091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882422.29098: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882422.29108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.29121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.29132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.29139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.29145: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882422.29157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.29227: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882422.29242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882422.29249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882422.29394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882422.31291: stdout chunk (state=3): >>>ansible-tmp-1726882422.2817457-14052-213669476025127=/root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127 <<< 12081 1726882422.31478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882422.31482: stdout chunk (state=3): >>><<< 12081 1726882422.31489: stderr chunk (state=3): >>><<< 12081 1726882422.31509: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882422.2817457-14052-213669476025127=/root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
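The `( umask 77 && mkdir -p ... && mkdir ... )` command that just succeeded is Ansible's standard remote-tmpdir pattern. A minimal local sketch of the same idea (hypothetical paths, not Ansible's actual code): `umask 077` makes the directory owner-only (0700), and the inner `mkdir` without `-p` acts as a collision guard by failing if the uniquely named directory already exists.

```python
import os
import stat
import tempfile

# Hypothetical local stand-in for /root/.ansible/tmp used in the log.
root = tempfile.mkdtemp(prefix="demo-ansible-")

# umask 077 strips group/other bits, so the new dir comes out 0700.
old_umask = os.umask(0o077)
try:
    tmp_dir = os.path.join(root, "ansible-tmp-demo")
    os.mkdir(tmp_dir)  # plain mkdir: raises FileExistsError on a name collision
finally:
    os.umask(old_umask)

mode = stat.S_IMODE(os.stat(tmp_dir).st_mode)
print(oct(mode))  # owner-only permissions on typical POSIX filesystems
```

This mirrors why the log echoes the generated directory name back: the controller needs the exact path to upload `AnsiballZ_dnf.py` into in the next step.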
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882422.31545: variable 'ansible_module_compression' from source: unknown 12081 1726882422.31609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12081 1726882422.31656: variable 'ansible_facts' from source: unknown 12081 1726882422.31759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/AnsiballZ_dnf.py 12081 1726882422.31900: Sending initial data 12081 1726882422.31904: Sent initial data (152 bytes) 12081 1726882422.32851: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882422.32860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.32873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.32887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.32925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.32932: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882422.32943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.32958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882422.32965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882422.32978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882422.32986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.32996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.33007: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.33015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.33021: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882422.33029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.33103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882422.33119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882422.33122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882422.33263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882422.35004: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882422.35102: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882422.35202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpa9yjoz_6 /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/AnsiballZ_dnf.py <<< 12081 1726882422.35301: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882422.36876: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882422.37044: stderr chunk (state=3): >>><<< 12081 1726882422.37047: stdout chunk (state=3): >>><<< 12081 1726882422.37078: done transferring module to remote 12081 1726882422.37090: _low_level_execute_command(): starting 12081 1726882422.37095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/ /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/AnsiballZ_dnf.py && sleep 0' 12081 1726882422.37794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882422.37802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.37814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.37830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.37881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.37889: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882422.37899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.37913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882422.37920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882422.37927: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882422.37934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.37950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.37967: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.37976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.37983: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882422.37993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.38075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882422.38094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882422.38107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882422.38238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882422.40077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882422.40081: stdout chunk (state=3): >>><<< 12081 1726882422.40088: stderr chunk (state=3): >>><<< 12081 1726882422.40106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882422.40109: _low_level_execute_command(): starting 12081 1726882422.40114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/AnsiballZ_dnf.py && sleep 0' 12081 1726882422.40785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882422.40794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.40804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.40819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.40862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.40872: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882422.40882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.40896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882422.40903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882422.40909: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882422.40917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882422.40926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882422.40937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882422.40944: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882422.40951: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882422.40965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882422.41036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882422.41054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882422.41072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882422.41212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.41836: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12081 1726882423.47675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882423.47723: stderr chunk (state=3): >>><<< 12081 1726882423.47727: stdout chunk (state=3): >>><<< 12081 1726882423.47742: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
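The module result above arrives as a single JSON object on stdout, which the controller parses to build the task result. A minimal sketch (the `summary` shape is illustrative, not Ansible's internal structure) of pulling out the fields the log later reports, using a trimmed copy of the payload captured above:

```python
import json

# Trimmed copy of the dnf module result seen in the log above.
raw = '''{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0,
          "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present"}}}'''

result = json.loads(raw)
summary = {
    "msg": result["msg"],
    "changed": result["changed"],
    "rc": result["rc"],
    "packages": result["invocation"]["module_args"]["name"],
}
print(summary)
```

Note how `"changed": false` with `"msg": "Nothing to do"` corresponds to the idempotent `ok:` status the task reports shortly afterward: dnsmasq was already installed, so `state: present` required no action.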
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882423.47780: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882423.47785: _low_level_execute_command(): starting 12081 1726882423.47790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882422.2817457-14052-213669476025127/ > /dev/null 2>&1 && sleep 0' 12081 1726882423.48221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.48227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.48269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.48282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882423.48285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.48330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.48342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.48445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.50275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882423.50322: stderr chunk (state=3): >>><<< 12081 1726882423.50326: stdout chunk (state=3): >>><<< 12081 1726882423.50345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882423.50352: handler run complete 12081 1726882423.50473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882423.50613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882423.50640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882423.50670: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882423.50691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882423.50804: variable '__install_status' from source: set_fact 12081 1726882423.50807: Evaluated conditional (__install_status is success): True 12081 1726882423.50809: attempt loop complete, returning result 12081 1726882423.50812: _execute() done 12081 1726882423.50813: dumping result to json 12081 1726882423.50815: done dumping result, returning 12081 1726882423.50817: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0e448fcc-3ce9-0a3f-ff3c-000000000974] 12081 1726882423.50819: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000974 12081 1726882423.50915: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000974 12081 1726882423.50917: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12081 1726882423.51002: no more pending results, returning what we have 12081 1726882423.51007: results queue empty 12081 1726882423.51007: checking for any_errors_fatal 12081 1726882423.51009: done checking for any_errors_fatal 12081 1726882423.51009: checking for max_fail_percentage 12081 1726882423.51011: done checking for 
max_fail_percentage 12081 1726882423.51012: checking to see if all hosts have failed and the running result is not ok 12081 1726882423.51013: done checking to see if all hosts have failed 12081 1726882423.51014: getting the remaining hosts for this loop 12081 1726882423.51016: done getting the remaining hosts for this loop 12081 1726882423.51020: getting the next task for host managed_node3 12081 1726882423.51027: done getting next task for host managed_node3 12081 1726882423.51030: ^ task is: TASK: Install pgrep, sysctl 12081 1726882423.51033: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882423.51037: getting variables 12081 1726882423.51038: in VariableManager get_vars() 12081 1726882423.51081: Calling all_inventory to load vars for managed_node3 12081 1726882423.51084: Calling groups_inventory to load vars for managed_node3 12081 1726882423.51086: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882423.51097: Calling all_plugins_play to load vars for managed_node3 12081 1726882423.51099: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882423.51101: Calling groups_plugins_play to load vars for managed_node3 12081 1726882423.52750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882423.54260: done with get_vars() 12081 1726882423.54285: done getting variables 12081 1726882423.54329: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:43 -0400 (0:00:01.337) 0:00:43.346 ****** 12081 1726882423.54358: entering _queue_task() for managed_node3/package 12081 1726882423.54600: worker is 1 (out of 1 available) 12081 1726882423.54613: exiting _queue_task() for managed_node3/package 12081 1726882423.54626: done queuing things up, now waiting for results queue to drain 12081 1726882423.54627: waiting for pending results... 
12081 1726882423.54810: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12081 1726882423.54901: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000975 12081 1726882423.54915: variable 'ansible_search_path' from source: unknown 12081 1726882423.54921: variable 'ansible_search_path' from source: unknown 12081 1726882423.54952: calling self._execute() 12081 1726882423.55038: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882423.55041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882423.55049: variable 'omit' from source: magic vars 12081 1726882423.55352: variable 'ansible_distribution_major_version' from source: facts 12081 1726882423.55365: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882423.55445: variable 'ansible_os_family' from source: facts 12081 1726882423.55449: Evaluated conditional (ansible_os_family == 'RedHat'): True 12081 1726882423.55580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882423.55783: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882423.55814: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882423.55841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882423.55870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882423.55926: variable 'ansible_distribution_major_version' from source: facts 12081 1726882423.55937: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 12081 1726882423.55940: when evaluation is False, skipping this task 12081 1726882423.55943: _execute() done 12081 1726882423.55945: dumping result to json 12081 1726882423.55947: done dumping result, 
returning 12081 1726882423.55953: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0e448fcc-3ce9-0a3f-ff3c-000000000975] 12081 1726882423.55962: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000975 12081 1726882423.56050: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000975 12081 1726882423.56052: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 12081 1726882423.56104: no more pending results, returning what we have 12081 1726882423.56108: results queue empty 12081 1726882423.56109: checking for any_errors_fatal 12081 1726882423.56117: done checking for any_errors_fatal 12081 1726882423.56118: checking for max_fail_percentage 12081 1726882423.56119: done checking for max_fail_percentage 12081 1726882423.56120: checking to see if all hosts have failed and the running result is not ok 12081 1726882423.56121: done checking to see if all hosts have failed 12081 1726882423.56122: getting the remaining hosts for this loop 12081 1726882423.56124: done getting the remaining hosts for this loop 12081 1726882423.56128: getting the next task for host managed_node3 12081 1726882423.56134: done getting next task for host managed_node3 12081 1726882423.56136: ^ task is: TASK: Install pgrep, sysctl 12081 1726882423.56140: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
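The skip above comes from the conditional `ansible_distribution_major_version is version('6', '<=')` evaluating False on a newer distribution. A rough sketch of that comparison (an assumption for illustration: for single-integer major versions like these, Ansible's `version` test reduces to a plain numeric compare):

```python
# Hypothetical helper mimicking the `version(..., '<=')` test for integer majors.
def major_version_le(actual: str, threshold: str) -> bool:
    return int(actual) <= int(threshold)

print(major_version_le("9", "6"))  # False -> task skipped, as in the log
print(major_version_le("6", "6"))  # True  -> the EL6 branch would run instead
```

This is why the playbook carries two "Install pgrep, sysctl" tasks back to back: one gated on `<= 6` (skipped here) and one gated on `>= 7`, which the log evaluates True next.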
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882423.56146: getting variables 12081 1726882423.56147: in VariableManager get_vars() 12081 1726882423.56192: Calling all_inventory to load vars for managed_node3 12081 1726882423.56194: Calling groups_inventory to load vars for managed_node3 12081 1726882423.56196: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882423.56206: Calling all_plugins_play to load vars for managed_node3 12081 1726882423.56209: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882423.56211: Calling groups_plugins_play to load vars for managed_node3 12081 1726882423.57666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882423.59347: done with get_vars() 12081 1726882423.59376: done getting variables 12081 1726882423.59435: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:43 -0400 (0:00:00.051) 0:00:43.397 ****** 12081 1726882423.59469: entering _queue_task() for managed_node3/package 12081 1726882423.59798: worker is 1 (out of 1 available) 12081 1726882423.59811: exiting _queue_task() for managed_node3/package 12081 1726882423.59824: done queuing 
things up, now waiting for results queue to drain 12081 1726882423.59826: waiting for pending results... 12081 1726882423.60120: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12081 1726882423.60245: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000976 12081 1726882423.60272: variable 'ansible_search_path' from source: unknown 12081 1726882423.60281: variable 'ansible_search_path' from source: unknown 12081 1726882423.60322: calling self._execute() 12081 1726882423.60437: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882423.60448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882423.60460: variable 'omit' from source: magic vars 12081 1726882423.60842: variable 'ansible_distribution_major_version' from source: facts 12081 1726882423.60860: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882423.60984: variable 'ansible_os_family' from source: facts 12081 1726882423.60994: Evaluated conditional (ansible_os_family == 'RedHat'): True 12081 1726882423.61174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882423.61474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882423.61520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882423.61560: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882423.61603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882423.61687: variable 'ansible_distribution_major_version' from source: facts 12081 1726882423.61705: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12081 1726882423.61715: variable 'omit' from source: magic vars 12081 1726882423.61763: variable 
'omit' from source: magic vars 12081 1726882423.61974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882423.64266: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882423.64337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882423.64377: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882423.64416: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882423.64442: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882423.64536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882423.64582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882423.64617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882423.64662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882423.64684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882423.64792: variable 
'__network_is_ostree' from source: set_fact 12081 1726882423.64802: variable 'omit' from source: magic vars 12081 1726882423.64839: variable 'omit' from source: magic vars 12081 1726882423.64874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882423.64904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882423.64927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882423.64951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882423.64967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882423.65000: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882423.65007: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882423.65014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882423.65120: Set connection var ansible_pipelining to False 12081 1726882423.65128: Set connection var ansible_shell_type to sh 12081 1726882423.65139: Set connection var ansible_shell_executable to /bin/sh 12081 1726882423.65146: Set connection var ansible_connection to ssh 12081 1726882423.65158: Set connection var ansible_timeout to 10 12081 1726882423.65170: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882423.65200: variable 'ansible_shell_executable' from source: unknown 12081 1726882423.65207: variable 'ansible_connection' from source: unknown 12081 1726882423.65213: variable 'ansible_module_compression' from source: unknown 12081 1726882423.65219: variable 'ansible_shell_type' from source: unknown 12081 1726882423.65224: variable 'ansible_shell_executable' from source: unknown 12081 
1726882423.65230: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882423.65236: variable 'ansible_pipelining' from source: unknown 12081 1726882423.65241: variable 'ansible_timeout' from source: unknown 12081 1726882423.65247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882423.65346: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882423.65364: variable 'omit' from source: magic vars 12081 1726882423.65378: starting attempt loop 12081 1726882423.65386: running the handler 12081 1726882423.65396: variable 'ansible_facts' from source: unknown 12081 1726882423.65402: variable 'ansible_facts' from source: unknown 12081 1726882423.65437: _low_level_execute_command(): starting 12081 1726882423.65448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882423.66187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882423.66202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.66216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.66236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.66284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882423.66296: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882423.66309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.66326: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 12081 1726882423.66348: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882423.66361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882423.66375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.66388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.66402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.66412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882423.66422: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882423.66434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.66512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882423.66529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.66542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.66692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.68388: stdout chunk (state=3): >>>/root <<< 12081 1726882423.68489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882423.68585: stderr chunk (state=3): >>><<< 12081 1726882423.68588: stdout chunk (state=3): >>><<< 12081 1726882423.68705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882423.68709: _low_level_execute_command(): starting 12081 1726882423.68711: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400 `" && echo ansible-tmp-1726882423.6860917-14123-62504970899400="` echo /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400 `" ) && sleep 0' 12081 1726882423.69319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882423.69334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.69348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.69373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.69415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882423.69426: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882423.69439: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.69455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882423.69473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882423.69485: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882423.69497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.69511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.69529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.69541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882423.69553: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882423.69572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.69644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882423.69662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.69681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.69821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.71718: stdout chunk (state=3): >>>ansible-tmp-1726882423.6860917-14123-62504970899400=/root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400 <<< 12081 1726882423.71893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882423.71897: stdout chunk (state=3): >>><<< 12081 1726882423.71903: stderr chunk (state=3): >>><<< 12081 1726882423.71921: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882423.6860917-14123-62504970899400=/root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882423.71955: variable 'ansible_module_compression' from source: unknown 12081 1726882423.72024: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12081 1726882423.72076: variable 'ansible_facts' from source: unknown 12081 1726882423.72188: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/AnsiballZ_dnf.py 12081 1726882423.72337: Sending initial data 12081 1726882423.72347: Sent initial data (151 bytes) 12081 1726882423.73009: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882423.73017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.73069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.73077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.73080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882423.73082: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.73128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.73142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.73254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.75064: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12081 1726882423.75069: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12081 1726882423.75072: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12081 1726882423.75074: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12081 1726882423.75076: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 12081 
1726882423.75078: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 12081 1726882423.75080: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 12081 1726882423.75082: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 12081 1726882423.75084: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882423.75152: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 12081 1726882423.75168: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 12081 1726882423.75179: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 12081 1726882423.75298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpb8xn7kpu /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/AnsiballZ_dnf.py <<< 12081 1726882423.75395: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882423.76860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882423.77059: stderr chunk (state=3): >>><<< 12081 1726882423.77066: stdout chunk (state=3): >>><<< 12081 1726882423.77069: done transferring module to remote 12081 1726882423.77072: _low_level_execute_command(): starting 12081 1726882423.77076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/ /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/AnsiballZ_dnf.py && sleep 0' 12081 1726882423.77451: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.77458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.77489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.77492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882423.77494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.77551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882423.77561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.77562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.77661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882423.79395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882423.79442: stderr chunk (state=3): >>><<< 12081 1726882423.79444: stdout chunk (state=3): >>><<< 12081 1726882423.79465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882423.79469: _low_level_execute_command(): starting 12081 1726882423.79476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/AnsiballZ_dnf.py && sleep 0' 12081 1726882423.79922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882423.79928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882423.79971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.79985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 <<< 12081 1726882423.79988: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882423.80038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882423.80041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882423.80047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882423.80152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882424.81329: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12081 1726882424.87257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882424.87260: stdout chunk (state=3): >>><<< 12081 1726882424.87263: stderr chunk (state=3): >>><<< 12081 1726882424.87418: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882424.87422: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882424.87425: _low_level_execute_command(): starting 12081 1726882424.87427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882423.6860917-14123-62504970899400/ > /dev/null 2>&1 && sleep 0' 12081 1726882424.88041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882424.88054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882424.88078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882424.88097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882424.88138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882424.88150: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882424.88166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882424.88193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882424.88205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882424.88216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882424.88226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882424.88238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882424.88252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882424.88266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882424.88278: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882424.88292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882424.88373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882424.88394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882424.88417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882424.88552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882424.90476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882424.90480: stdout chunk (state=3): >>><<< 12081 1726882424.90482: stderr chunk (state=3): >>><<< 12081 1726882424.90872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882424.90876: handler run complete 12081 1726882424.90879: attempt loop complete, returning result 12081 1726882424.90881: _execute() done 12081 1726882424.90883: dumping result to json 12081 1726882424.90885: done dumping result, returning 12081 1726882424.90888: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0e448fcc-3ce9-0a3f-ff3c-000000000976] 12081 1726882424.90894: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000976 12081 1726882424.90969: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000976 12081 1726882424.90972: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12081 1726882424.91051: no more pending results, returning what we have 12081 1726882424.91054: results queue empty 12081 1726882424.91055: checking for any_errors_fatal 12081 1726882424.91060: done checking for any_errors_fatal 12081 1726882424.91061: checking for max_fail_percentage 12081 1726882424.91066: done checking for max_fail_percentage 12081 1726882424.91067: checking to see if all hosts have failed and the running result is not ok 12081 1726882424.91068: done 
checking to see if all hosts have failed 12081 1726882424.91069: getting the remaining hosts for this loop 12081 1726882424.91070: done getting the remaining hosts for this loop 12081 1726882424.91074: getting the next task for host managed_node3 12081 1726882424.91080: done getting next task for host managed_node3 12081 1726882424.91083: ^ task is: TASK: Create test interfaces 12081 1726882424.91086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882424.91090: getting variables 12081 1726882424.91092: in VariableManager get_vars() 12081 1726882424.91126: Calling all_inventory to load vars for managed_node3 12081 1726882424.91129: Calling groups_inventory to load vars for managed_node3 12081 1726882424.91132: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882424.91142: Calling all_plugins_play to load vars for managed_node3 12081 1726882424.91145: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882424.91149: Calling groups_plugins_play to load vars for managed_node3 12081 1726882424.92620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882424.93710: done with get_vars() 12081 1726882424.93727: done getting variables 12081 1726882424.93774: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:33:44 -0400 (0:00:01.343) 0:00:44.740 ****** 12081 1726882424.93797: entering _queue_task() for managed_node3/shell 12081 1726882424.94037: worker is 1 (out of 1 available) 12081 1726882424.94050: exiting _queue_task() for managed_node3/shell 12081 1726882424.94065: done queuing things up, now waiting for results queue to drain 12081 1726882424.94066: waiting for pending results... 
12081 1726882424.94260: running TaskExecutor() for managed_node3/TASK: Create test interfaces 12081 1726882424.94339: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000977 12081 1726882424.94350: variable 'ansible_search_path' from source: unknown 12081 1726882424.94353: variable 'ansible_search_path' from source: unknown 12081 1726882424.94403: calling self._execute() 12081 1726882424.94522: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882424.94534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882424.94544: variable 'omit' from source: magic vars 12081 1726882424.94959: variable 'ansible_distribution_major_version' from source: facts 12081 1726882424.94979: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882424.94992: variable 'omit' from source: magic vars 12081 1726882424.95046: variable 'omit' from source: magic vars 12081 1726882424.95478: variable 'dhcp_interface1' from source: play vars 12081 1726882424.95491: variable 'dhcp_interface2' from source: play vars 12081 1726882424.95517: variable 'omit' from source: magic vars 12081 1726882424.95571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882424.95613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882424.95638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882424.95667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882424.95685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882424.95734: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882424.95749: variable 'ansible_host' from source: host 
vars for 'managed_node3' 12081 1726882424.95753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882424.95839: Set connection var ansible_pipelining to False 12081 1726882424.95842: Set connection var ansible_shell_type to sh 12081 1726882424.95847: Set connection var ansible_shell_executable to /bin/sh 12081 1726882424.95850: Set connection var ansible_connection to ssh 12081 1726882424.95882: Set connection var ansible_timeout to 10 12081 1726882424.95885: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882424.95898: variable 'ansible_shell_executable' from source: unknown 12081 1726882424.95908: variable 'ansible_connection' from source: unknown 12081 1726882424.95913: variable 'ansible_module_compression' from source: unknown 12081 1726882424.95916: variable 'ansible_shell_type' from source: unknown 12081 1726882424.95920: variable 'ansible_shell_executable' from source: unknown 12081 1726882424.95922: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882424.95924: variable 'ansible_pipelining' from source: unknown 12081 1726882424.95926: variable 'ansible_timeout' from source: unknown 12081 1726882424.95931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882424.96038: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882424.96046: variable 'omit' from source: magic vars 12081 1726882424.96057: starting attempt loop 12081 1726882424.96060: running the handler 12081 1726882424.96071: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882424.96088: _low_level_execute_command(): starting 12081 1726882424.96097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882424.96612: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882424.96621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882424.96649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882424.96663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882424.96678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882424.96721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882424.96734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882424.96840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882424.98436: stdout chunk (state=3): >>>/root <<< 12081 1726882424.98606: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 12081 1726882424.98616: stdout chunk (state=3): >>><<< 12081 1726882424.98629: stderr chunk (state=3): >>><<< 12081 1726882424.98654: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882424.98684: _low_level_execute_command(): starting 12081 1726882424.98697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780 `" && echo ansible-tmp-1726882424.9866695-14153-31304754641780="` echo /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780 `" ) && sleep 0' 12081 1726882424.99383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882424.99406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882424.99426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882424.99447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882424.99497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882424.99510: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882424.99524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882424.99549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882424.99565: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882424.99580: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882424.99592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882424.99606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882424.99622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882424.99635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882424.99653: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882424.99672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882424.99750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882424.99779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882424.99798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882424.99933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882425.01796: stdout chunk (state=3): >>>ansible-tmp-1726882424.9866695-14153-31304754641780=/root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780 <<< 12081 1726882425.01905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882425.01985: stderr chunk (state=3): >>><<< 12081 1726882425.01988: stdout chunk (state=3): >>><<< 12081 1726882425.02274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882424.9866695-14153-31304754641780=/root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882425.02278: variable 'ansible_module_compression' from source: unknown 12081 1726882425.02280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882425.02282: variable 
'ansible_facts' from source: unknown 12081 1726882425.02284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/AnsiballZ_command.py 12081 1726882425.02396: Sending initial data 12081 1726882425.02399: Sent initial data (155 bytes) 12081 1726882425.03427: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882425.03442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.03458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.03487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.03532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.03545: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882425.03560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882425.03581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882425.03603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882425.03615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882425.03627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.03640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.03657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.03672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.03684: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882425.03698: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882425.03782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882425.03804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882425.03830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882425.03958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882425.05706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882425.05799: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882425.05901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp31pskzot /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/AnsiballZ_command.py <<< 12081 1726882425.05995: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882425.07285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882425.07572: stderr chunk (state=3): >>><<< 12081 1726882425.07575: stdout chunk (state=3): >>><<< 12081 1726882425.07578: done transferring module to remote 12081 1726882425.07580: _low_level_execute_command(): starting 12081 1726882425.07582: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/ /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/AnsiballZ_command.py && sleep 0' 12081 1726882425.08210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882425.08231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.08250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.08272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.08314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.08326: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882425.08350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882425.08370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882425.08381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882425.08391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882425.08402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.08416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.08432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.08442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.08460: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882425.08477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
12081 1726882425.08553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882425.08584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882425.08601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882425.08727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882425.10476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882425.10562: stderr chunk (state=3): >>><<< 12081 1726882425.10578: stdout chunk (state=3): >>><<< 12081 1726882425.10689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882425.10692: _low_level_execute_command(): starting 12081 1726882425.10695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/AnsiballZ_command.py && sleep 0' 12081 1726882425.11289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882425.11302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.11316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.11334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.11382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.11393: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882425.11406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882425.11423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882425.11434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882425.11446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882425.11468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882425.11482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882425.11498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882425.11509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882425.11520: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882425.11533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882425.11608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 
1726882425.11630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882425.11645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882425.11790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.45686: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 
'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 12081 1726882426.45711: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:45.246572", "end": "2024-09-20 21:33:46.454883", "delta": "0:00:01.208311", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882426.47217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882426.47221: stdout chunk (state=3): >>><<< 12081 1726882426.47223: stderr chunk (state=3): >>><<< 12081 1726882426.47396: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:33:45.246572", "end": "2024-09-20 21:33:46.454883", "delta": "0:00:01.208311", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882426.47408: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882426.47411: _low_level_execute_command(): starting 12081 1726882426.47413: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882424.9866695-14153-31304754641780/ > /dev/null 2>&1 && sleep 0' 12081 1726882426.48007: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882426.48021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.48034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.48054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.48099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.48110: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882426.48123: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.48138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882426.48152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882426.48166: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882426.48178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.48191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.48204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.48214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.48224: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882426.48235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.48316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.48337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882426.48351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882426.48483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.50430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882426.50433: stdout chunk (state=3): >>><<< 12081 1726882426.50436: stderr chunk (state=3): >>><<< 12081 1726882426.50672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882426.50675: handler run complete 12081 1726882426.50677: Evaluated conditional (False): False 12081 1726882426.50680: attempt loop complete, returning result 12081 1726882426.50682: _execute() done 12081 1726882426.50687: dumping result to json 12081 1726882426.50689: done dumping result, returning 12081 1726882426.50691: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000977] 12081 1726882426.50693: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000977 12081 1726882426.50769: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000977 12081 1726882426.50772: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed 
false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.208311", "end": "2024-09-20 21:33:46.454883", "rc": 0, "start": "2024-09-20 21:33:45.246572" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12081 1726882426.50853: no more pending results, returning what we have 12081 1726882426.50857: results queue empty 12081 1726882426.50858: checking for any_errors_fatal 12081 1726882426.50869: done checking for any_errors_fatal 12081 1726882426.50870: checking for max_fail_percentage 12081 1726882426.50872: done checking for max_fail_percentage 12081 1726882426.50873: checking to see if all hosts have failed and 
the running result is not ok 12081 1726882426.50874: done checking to see if all hosts have failed 12081 1726882426.50875: getting the remaining hosts for this loop 12081 1726882426.50877: done getting the remaining hosts for this loop 12081 1726882426.50881: getting the next task for host managed_node3 12081 1726882426.50892: done getting next task for host managed_node3 12081 1726882426.50895: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12081 1726882426.50900: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882426.50904: getting variables 12081 1726882426.50906: in VariableManager get_vars() 12081 1726882426.50944: Calling all_inventory to load vars for managed_node3 12081 1726882426.50947: Calling groups_inventory to load vars for managed_node3 12081 1726882426.50949: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882426.50961: Calling all_plugins_play to load vars for managed_node3 12081 1726882426.50966: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882426.50969: Calling groups_plugins_play to load vars for managed_node3 12081 1726882426.52572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882426.54961: done with get_vars() 12081 1726882426.54993: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:46 -0400 (0:00:01.612) 0:00:46.353 ****** 12081 1726882426.55099: entering _queue_task() for managed_node3/include_tasks 12081 1726882426.56092: worker is 1 (out of 1 available) 12081 1726882426.56106: exiting _queue_task() for managed_node3/include_tasks 12081 1726882426.56118: done queuing things up, now waiting for results queue to drain 12081 1726882426.56119: waiting for pending results... 
12081 1726882426.56412: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12081 1726882426.56569: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000097e 12081 1726882426.56588: variable 'ansible_search_path' from source: unknown 12081 1726882426.56594: variable 'ansible_search_path' from source: unknown 12081 1726882426.56635: calling self._execute() 12081 1726882426.56744: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882426.56760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882426.56776: variable 'omit' from source: magic vars 12081 1726882426.57147: variable 'ansible_distribution_major_version' from source: facts 12081 1726882426.57166: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882426.57177: _execute() done 12081 1726882426.57184: dumping result to json 12081 1726882426.57190: done dumping result, returning 12081 1726882426.57200: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-00000000097e] 12081 1726882426.57214: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000097e 12081 1726882426.57321: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000097e 12081 1726882426.57351: no more pending results, returning what we have 12081 1726882426.57356: in VariableManager get_vars() 12081 1726882426.57403: Calling all_inventory to load vars for managed_node3 12081 1726882426.57406: Calling groups_inventory to load vars for managed_node3 12081 1726882426.57408: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882426.57423: Calling all_plugins_play to load vars for managed_node3 12081 1726882426.57426: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882426.57429: Calling groups_plugins_play to load vars for managed_node3 12081 1726882426.60125: WORKER PROCESS EXITING 12081 
1726882426.60574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882426.62521: done with get_vars() 12081 1726882426.62549: variable 'ansible_search_path' from source: unknown 12081 1726882426.62551: variable 'ansible_search_path' from source: unknown 12081 1726882426.62593: we have included files to process 12081 1726882426.62594: generating all_blocks data 12081 1726882426.62596: done generating all_blocks data 12081 1726882426.62603: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882426.62605: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882426.62607: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882426.62842: done processing included file 12081 1726882426.62844: iterating over new_blocks loaded from include file 12081 1726882426.62846: in VariableManager get_vars() 12081 1726882426.62871: done with get_vars() 12081 1726882426.62873: filtering new block on tags 12081 1726882426.62905: done filtering new block on tags 12081 1726882426.62907: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12081 1726882426.62913: extending task lists for all hosts with included blocks 12081 1726882426.63132: done extending task lists 12081 1726882426.63133: done processing included files 12081 1726882426.63134: results queue empty 12081 1726882426.63135: checking for any_errors_fatal 12081 1726882426.63141: done checking for any_errors_fatal 12081 1726882426.63142: checking for max_fail_percentage 12081 1726882426.63143: done checking for 
max_fail_percentage 12081 1726882426.63144: checking to see if all hosts have failed and the running result is not ok 12081 1726882426.63145: done checking to see if all hosts have failed 12081 1726882426.63145: getting the remaining hosts for this loop 12081 1726882426.63147: done getting the remaining hosts for this loop 12081 1726882426.63149: getting the next task for host managed_node3 12081 1726882426.63153: done getting next task for host managed_node3 12081 1726882426.63155: ^ task is: TASK: Get stat for interface {{ interface }} 12081 1726882426.63159: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882426.63162: getting variables 12081 1726882426.63162: in VariableManager get_vars() 12081 1726882426.63176: Calling all_inventory to load vars for managed_node3 12081 1726882426.63178: Calling groups_inventory to load vars for managed_node3 12081 1726882426.63180: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882426.63186: Calling all_plugins_play to load vars for managed_node3 12081 1726882426.63188: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882426.63191: Calling groups_plugins_play to load vars for managed_node3 12081 1726882426.65010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882426.70548: done with get_vars() 12081 1726882426.70573: done getting variables 12081 1726882426.70735: variable 'interface' from source: task vars 12081 1726882426.70739: variable 'dhcp_interface1' from source: play vars 12081 1726882426.70797: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:46 -0400 (0:00:00.157) 0:00:46.511 ****** 12081 1726882426.70832: entering _queue_task() for managed_node3/stat 12081 1726882426.71880: worker is 1 (out of 1 available) 12081 1726882426.71893: exiting _queue_task() for managed_node3/stat 12081 1726882426.71905: done queuing things up, now waiting for results queue to drain 12081 1726882426.71907: waiting for pending results... 
12081 1726882426.72662: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 12081 1726882426.73023: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000009dd 12081 1726882426.73179: variable 'ansible_search_path' from source: unknown 12081 1726882426.73188: variable 'ansible_search_path' from source: unknown 12081 1726882426.73232: calling self._execute() 12081 1726882426.73340: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882426.73357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882426.73375: variable 'omit' from source: magic vars 12081 1726882426.73771: variable 'ansible_distribution_major_version' from source: facts 12081 1726882426.73796: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882426.73807: variable 'omit' from source: magic vars 12081 1726882426.73888: variable 'omit' from source: magic vars 12081 1726882426.73995: variable 'interface' from source: task vars 12081 1726882426.74011: variable 'dhcp_interface1' from source: play vars 12081 1726882426.74104: variable 'dhcp_interface1' from source: play vars 12081 1726882426.74131: variable 'omit' from source: magic vars 12081 1726882426.74185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882426.74229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882426.74256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882426.74281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882426.74309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882426.74358: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882426.74371: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882426.74380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882426.74513: Set connection var ansible_pipelining to False 12081 1726882426.74521: Set connection var ansible_shell_type to sh 12081 1726882426.74534: Set connection var ansible_shell_executable to /bin/sh 12081 1726882426.74541: Set connection var ansible_connection to ssh 12081 1726882426.74559: Set connection var ansible_timeout to 10 12081 1726882426.74573: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882426.74610: variable 'ansible_shell_executable' from source: unknown 12081 1726882426.74618: variable 'ansible_connection' from source: unknown 12081 1726882426.74626: variable 'ansible_module_compression' from source: unknown 12081 1726882426.74634: variable 'ansible_shell_type' from source: unknown 12081 1726882426.74643: variable 'ansible_shell_executable' from source: unknown 12081 1726882426.74649: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882426.74667: variable 'ansible_pipelining' from source: unknown 12081 1726882426.74682: variable 'ansible_timeout' from source: unknown 12081 1726882426.74701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882426.74924: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882426.74941: variable 'omit' from source: magic vars 12081 1726882426.74951: starting attempt loop 12081 1726882426.74962: running the handler 12081 1726882426.74986: _low_level_execute_command(): starting 12081 1726882426.75003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882426.75803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882426.75819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.75835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.75860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.75911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.75925: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882426.75939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.75960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882426.75977: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882426.75991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882426.76007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.76021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.76036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.76050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.76068: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882426.76083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.76168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.76193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882426.76213: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882426.76355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.78022: stdout chunk (state=3): >>>/root <<< 12081 1726882426.78180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882426.78234: stderr chunk (state=3): >>><<< 12081 1726882426.78237: stdout chunk (state=3): >>><<< 12081 1726882426.78361: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882426.78367: _low_level_execute_command(): starting 12081 1726882426.78370: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113 `" && echo 
ansible-tmp-1726882426.7826476-14215-125832109960113="` echo /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113 `" ) && sleep 0' 12081 1726882426.79837: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.79841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.79884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.79887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.79890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.80067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.80070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882426.80198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.82086: stdout chunk (state=3): >>>ansible-tmp-1726882426.7826476-14215-125832109960113=/root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113 <<< 12081 1726882426.82200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882426.82289: stderr chunk 
(state=3): >>><<< 12081 1726882426.82293: stdout chunk (state=3): >>><<< 12081 1726882426.82370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882426.7826476-14215-125832109960113=/root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882426.82572: variable 'ansible_module_compression' from source: unknown 12081 1726882426.82575: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882426.82578: variable 'ansible_facts' from source: unknown 12081 1726882426.82580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/AnsiballZ_stat.py 12081 1726882426.83233: Sending initial data 12081 1726882426.83237: Sent initial data (153 bytes) 12081 1726882426.85880: stderr chunk 
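The mkdir command executed above follows Ansible's standard remote tmp-dir handshake: a subshell sets `umask 77` so the directories come out owner-only, then echoes back the `name=path` pair the controller parses. A minimal sketch of the same pattern, using illustrative paths rather than the ones from this run:

```shell
# Sketch of the remote tmp-dir handshake (paths are illustrative).
base="${TMPDIR:-/tmp}/ansible-demo-tmp"
stamp="ansible-tmp-$(date +%s)-$$"

# umask 77 in a subshell mirrors the `( umask 77 && mkdir ... )` seen in the
# log: the created directories are readable/writable by the owner only.
( umask 77 && mkdir -p "$base" && mkdir "$base/$stamp" \
  && echo "$stamp=$base/$stamp" )

# Record whether the directory was created, then clean up.
created=no
[ -d "$base/$stamp" ] && created=yes
rm -rf "$base"
```

The subshell keeps the umask change from leaking into the rest of the session, which is why Ansible wraps the whole sequence in parentheses.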
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882426.85997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.86015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.86029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.86075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.86083: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882426.86098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.86114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882426.86118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882426.86125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882426.86133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.86142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.86379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.86383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.86385: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882426.86387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.86389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.86393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882426.86395: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882426.86499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.88272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882426.88370: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882426.88474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpl9l143x8 /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/AnsiballZ_stat.py <<< 12081 1726882426.88578: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882426.89978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882426.89982: stderr chunk (state=3): >>><<< 12081 1726882426.89987: stdout chunk (state=3): >>><<< 12081 1726882426.90009: done transferring module to remote 12081 1726882426.90020: _low_level_execute_command(): starting 12081 1726882426.90025: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/ /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/AnsiballZ_stat.py && sleep 0' 12081 1726882426.91629: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882426.91761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.91780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.91791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.91830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.91837: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882426.91849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.91870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882426.91881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882426.91889: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882426.91897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.91906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.91918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.91926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882426.91932: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882426.91942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.92029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.92201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882426.92209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882426.92412: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882426.94241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882426.94245: stdout chunk (state=3): >>><<< 12081 1726882426.94250: stderr chunk (state=3): >>><<< 12081 1726882426.94272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882426.94275: _low_level_execute_command(): starting 12081 1726882426.94280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/AnsiballZ_stat.py && sleep 0' 12081 1726882426.95691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.95695: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882426.95740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.95746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882426.95762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882426.95771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882426.95849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882426.95871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882426.96000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.09225: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28179, "dev": 21, "nlink": 1, "atime": 1726882425.2544837, "mtime": 1726882425.2544837, "ctime": 1726882425.2544837, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, 
"writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882427.10233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882427.10237: stderr chunk (state=3): >>><<< 12081 1726882427.10240: stdout chunk (state=3): >>><<< 12081 1726882427.10265: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28179, "dev": 21, "nlink": 1, "atime": 1726882425.2544837, "mtime": 1726882425.2544837, "ctime": 1726882425.2544837, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882427.10325: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882427.10335: _low_level_execute_command(): starting 12081 1726882427.10338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882426.7826476-14215-125832109960113/ > /dev/null 2>&1 && sleep 0' 12081 1726882427.12046: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
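The stat result above reports `"islnk": true` with `lnk_source` pointing into /sys/devices: on Linux, entries under /sys/class/net are symlinks to the device's sysfs node, so the mere presence of the link is what proves the interface exists. A self-contained sketch of that check, run against a throwaway directory instead of the real /sys so it works on any machine:

```python
import os
import tempfile

def interface_present(sysfs_root: str, interface: str) -> bool:
    """Mirror the check this log performs: an interface is present when
    <sysfs_root>/<interface> exists. lexists() is used because the entry
    is a symlink (hence "islnk": true in the stat module's result)."""
    return os.path.lexists(os.path.join(sysfs_root, interface))

# Demo: fabricate the layout the kernel would create for a virtual
# interface named "test1" (a symlink into a devices tree).
with tempfile.TemporaryDirectory() as root:
    target = os.path.join(root, "devices-virtual-net-test1")
    os.mkdir(target)
    link = os.path.join(root, "test1")
    os.symlink(target, link)

    present = interface_present(root, "test1")   # the link exists
    missing = interface_present(root, "test0")   # no such entry
    is_link = os.path.islink(link)               # matches "islnk": true
```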
configuration data /root/.ssh/config <<< 12081 1726882427.12050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.12134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.12138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.12202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.12206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882427.12218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.12341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.12419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.12422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.12552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.14435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.14440: stderr chunk (state=3): >>><<< 12081 1726882427.14442: stdout chunk (state=3): >>><<< 12081 1726882427.14466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882427.14473: handler run complete 12081 1726882427.14526: attempt loop complete, returning result 12081 1726882427.14530: _execute() done 12081 1726882427.14532: dumping result to json 12081 1726882427.14537: done dumping result, returning 12081 1726882427.14545: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0e448fcc-3ce9-0a3f-ff3c-0000000009dd] 12081 1726882427.14553: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000009dd 12081 1726882427.14680: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000009dd 12081 1726882427.14683: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882425.2544837, "block_size": 4096, "blocks": 0, "ctime": 1726882425.2544837, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28179, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": 
false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882425.2544837, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12081 1726882427.14774: no more pending results, returning what we have 12081 1726882427.14778: results queue empty 12081 1726882427.14779: checking for any_errors_fatal 12081 1726882427.14780: done checking for any_errors_fatal 12081 1726882427.14781: checking for max_fail_percentage 12081 1726882427.14783: done checking for max_fail_percentage 12081 1726882427.14784: checking to see if all hosts have failed and the running result is not ok 12081 1726882427.14784: done checking to see if all hosts have failed 12081 1726882427.14785: getting the remaining hosts for this loop 12081 1726882427.14787: done getting the remaining hosts for this loop 12081 1726882427.14790: getting the next task for host managed_node3 12081 1726882427.14800: done getting next task for host managed_node3 12081 1726882427.14802: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12081 1726882427.14807: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882427.14812: getting variables 12081 1726882427.14814: in VariableManager get_vars() 12081 1726882427.14848: Calling all_inventory to load vars for managed_node3 12081 1726882427.14856: Calling groups_inventory to load vars for managed_node3 12081 1726882427.14858: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.14871: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.14874: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.14877: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.17315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882427.21055: done with get_vars() 12081 1726882427.21092: done getting variables 12081 1726882427.21271: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882427.21509: variable 'interface' from source: task vars 12081 1726882427.21513: variable 'dhcp_interface1' from source: play vars 12081 1726882427.21682: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 
'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:47 -0400 (0:00:00.508) 0:00:47.020 ****** 12081 1726882427.21715: entering _queue_task() for managed_node3/assert 12081 1726882427.22399: worker is 1 (out of 1 available) 12081 1726882427.22412: exiting _queue_task() for managed_node3/assert 12081 1726882427.22537: done queuing things up, now waiting for results queue to drain 12081 1726882427.22540: waiting for pending results... 12081 1726882427.23158: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 12081 1726882427.23504: in run() - task 0e448fcc-3ce9-0a3f-ff3c-00000000097f 12081 1726882427.23787: variable 'ansible_search_path' from source: unknown 12081 1726882427.23794: variable 'ansible_search_path' from source: unknown 12081 1726882427.23835: calling self._execute() 12081 1726882427.23938: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.23950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.24180: variable 'omit' from source: magic vars 12081 1726882427.24557: variable 'ansible_distribution_major_version' from source: facts 12081 1726882427.24784: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882427.24796: variable 'omit' from source: magic vars 12081 1726882427.24857: variable 'omit' from source: magic vars 12081 1726882427.25171: variable 'interface' from source: task vars 12081 1726882427.25183: variable 'dhcp_interface1' from source: play vars 12081 1726882427.25249: variable 'dhcp_interface1' from source: play vars 12081 1726882427.25290: variable 'omit' from source: magic vars 12081 1726882427.25523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882427.25567: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882427.25592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882427.25615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.25632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.25672: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882427.25876: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.25885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.25995: Set connection var ansible_pipelining to False 12081 1726882427.26004: Set connection var ansible_shell_type to sh 12081 1726882427.26017: Set connection var ansible_shell_executable to /bin/sh 12081 1726882427.26024: Set connection var ansible_connection to ssh 12081 1726882427.26034: Set connection var ansible_timeout to 10 12081 1726882427.26044: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882427.26079: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.26087: variable 'ansible_connection' from source: unknown 12081 1726882427.26093: variable 'ansible_module_compression' from source: unknown 12081 1726882427.26100: variable 'ansible_shell_type' from source: unknown 12081 1726882427.26106: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.26275: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.26284: variable 'ansible_pipelining' from source: unknown 12081 1726882427.26291: variable 'ansible_timeout' from source: unknown 12081 1726882427.26298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882427.26437: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882427.26458: variable 'omit' from source: magic vars 12081 1726882427.26485: starting attempt loop 12081 1726882427.26674: running the handler 12081 1726882427.26822: variable 'interface_stat' from source: set_fact 12081 1726882427.26848: Evaluated conditional (interface_stat.stat.exists): True 12081 1726882427.26862: handler run complete 12081 1726882427.27587: attempt loop complete, returning result 12081 1726882427.27595: _execute() done 12081 1726882427.27601: dumping result to json 12081 1726882427.27608: done dumping result, returning 12081 1726882427.27619: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-0a3f-ff3c-00000000097f] 12081 1726882427.27631: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000097f 12081 1726882427.27744: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-00000000097f 12081 1726882427.27751: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882427.27812: no more pending results, returning what we have 12081 1726882427.27816: results queue empty 12081 1726882427.27816: checking for any_errors_fatal 12081 1726882427.27829: done checking for any_errors_fatal 12081 1726882427.27830: checking for max_fail_percentage 12081 1726882427.27832: done checking for max_fail_percentage 12081 1726882427.27833: checking to see if all hosts have failed and the running result is not ok 12081 1726882427.27834: done checking to see if all hosts have failed 12081 1726882427.27834: getting the remaining hosts for this loop 12081 1726882427.27836: done getting the remaining hosts for this loop 
12081 1726882427.27840: getting the next task for host managed_node3 12081 1726882427.27850: done getting next task for host managed_node3 12081 1726882427.27854: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12081 1726882427.27859: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882427.27865: getting variables 12081 1726882427.27867: in VariableManager get_vars() 12081 1726882427.27905: Calling all_inventory to load vars for managed_node3 12081 1726882427.27908: Calling groups_inventory to load vars for managed_node3 12081 1726882427.27910: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.27923: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.27926: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.27929: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.30641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882427.34325: done with get_vars() 12081 1726882427.34476: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:47 -0400 (0:00:00.129) 0:00:47.149 ****** 12081 1726882427.34700: entering _queue_task() for managed_node3/include_tasks 12081 1726882427.35410: worker is 1 (out of 1 available) 12081 1726882427.35423: exiting _queue_task() for managed_node3/include_tasks 12081 1726882427.35550: done queuing things up, now waiting for results queue to drain 12081 1726882427.35552: waiting for pending results... 
12081 1726882427.36368: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12081 1726882427.36709: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000983 12081 1726882427.36730: variable 'ansible_search_path' from source: unknown 12081 1726882427.36738: variable 'ansible_search_path' from source: unknown 12081 1726882427.36785: calling self._execute() 12081 1726882427.36888: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.37080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.37096: variable 'omit' from source: magic vars 12081 1726882427.37887: variable 'ansible_distribution_major_version' from source: facts 12081 1726882427.37907: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882427.37918: _execute() done 12081 1726882427.37926: dumping result to json 12081 1726882427.37933: done dumping result, returning 12081 1726882427.37942: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000983] 12081 1726882427.37957: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000983 12081 1726882427.38099: no more pending results, returning what we have 12081 1726882427.38105: in VariableManager get_vars() 12081 1726882427.38150: Calling all_inventory to load vars for managed_node3 12081 1726882427.38153: Calling groups_inventory to load vars for managed_node3 12081 1726882427.38155: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.38175: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.38178: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.38182: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.38981: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000983 12081 1726882427.38985: WORKER PROCESS EXITING 12081 
1726882427.40705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882427.44332: done with get_vars() 12081 1726882427.44483: variable 'ansible_search_path' from source: unknown 12081 1726882427.44485: variable 'ansible_search_path' from source: unknown 12081 1726882427.44524: we have included files to process 12081 1726882427.44526: generating all_blocks data 12081 1726882427.44528: done generating all_blocks data 12081 1726882427.44532: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882427.44533: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882427.44536: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12081 1726882427.44961: done processing included file 12081 1726882427.44964: iterating over new_blocks loaded from include file 12081 1726882427.44966: in VariableManager get_vars() 12081 1726882427.45103: done with get_vars() 12081 1726882427.45105: filtering new block on tags 12081 1726882427.45137: done filtering new block on tags 12081 1726882427.45140: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12081 1726882427.45146: extending task lists for all hosts with included blocks 12081 1726882427.45644: done extending task lists 12081 1726882427.45645: done processing included files 12081 1726882427.45646: results queue empty 12081 1726882427.45647: checking for any_errors_fatal 12081 1726882427.45650: done checking for any_errors_fatal 12081 1726882427.45651: checking for max_fail_percentage 12081 1726882427.45652: done checking for 
max_fail_percentage 12081 1726882427.45653: checking to see if all hosts have failed and the running result is not ok 12081 1726882427.45654: done checking to see if all hosts have failed 12081 1726882427.45655: getting the remaining hosts for this loop 12081 1726882427.45656: done getting the remaining hosts for this loop 12081 1726882427.45659: getting the next task for host managed_node3 12081 1726882427.45665: done getting next task for host managed_node3 12081 1726882427.45667: ^ task is: TASK: Get stat for interface {{ interface }} 12081 1726882427.45672: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882427.45674: getting variables 12081 1726882427.45675: in VariableManager get_vars() 12081 1726882427.45688: Calling all_inventory to load vars for managed_node3 12081 1726882427.45691: Calling groups_inventory to load vars for managed_node3 12081 1726882427.45693: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.45698: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.45701: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.45704: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.48503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882427.50755: done with get_vars() 12081 1726882427.50903: done getting variables 12081 1726882427.51184: variable 'interface' from source: task vars 12081 1726882427.51188: variable 'dhcp_interface2' from source: play vars 12081 1726882427.51542: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:47 -0400 (0:00:00.169) 0:00:47.319 ****** 12081 1726882427.51654: entering _queue_task() for managed_node3/stat 12081 1726882427.52067: worker is 1 (out of 1 available) 12081 1726882427.52080: exiting _queue_task() for managed_node3/stat 12081 1726882427.52098: done queuing things up, now waiting for results queue to drain 12081 1726882427.52099: waiting for pending results... 
12081 1726882427.52402: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 12081 1726882427.52542: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a01 12081 1726882427.52565: variable 'ansible_search_path' from source: unknown 12081 1726882427.52574: variable 'ansible_search_path' from source: unknown 12081 1726882427.52614: calling self._execute() 12081 1726882427.52721: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.52731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.52746: variable 'omit' from source: magic vars 12081 1726882427.53127: variable 'ansible_distribution_major_version' from source: facts 12081 1726882427.53147: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882427.53158: variable 'omit' from source: magic vars 12081 1726882427.53241: variable 'omit' from source: magic vars 12081 1726882427.53343: variable 'interface' from source: task vars 12081 1726882427.53353: variable 'dhcp_interface2' from source: play vars 12081 1726882427.53426: variable 'dhcp_interface2' from source: play vars 12081 1726882427.53449: variable 'omit' from source: magic vars 12081 1726882427.53501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882427.53546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882427.53607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882427.53636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.53653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.53716: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12081 1726882427.53725: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.53732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.53902: Set connection var ansible_pipelining to False 12081 1726882427.53911: Set connection var ansible_shell_type to sh 12081 1726882427.53924: Set connection var ansible_shell_executable to /bin/sh 12081 1726882427.53930: Set connection var ansible_connection to ssh 12081 1726882427.53939: Set connection var ansible_timeout to 10 12081 1726882427.53949: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882427.53996: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.54004: variable 'ansible_connection' from source: unknown 12081 1726882427.54011: variable 'ansible_module_compression' from source: unknown 12081 1726882427.54017: variable 'ansible_shell_type' from source: unknown 12081 1726882427.54788: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.54797: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.54806: variable 'ansible_pipelining' from source: unknown 12081 1726882427.54813: variable 'ansible_timeout' from source: unknown 12081 1726882427.54821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.55423: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882427.55445: variable 'omit' from source: magic vars 12081 1726882427.55458: starting attempt loop 12081 1726882427.55469: running the handler 12081 1726882427.55497: _low_level_execute_command(): starting 12081 1726882427.55509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 
1726882427.56857: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.56861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.56865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.56868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.56870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.56875: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882427.56877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.56879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.56881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.56883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.56885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.56887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.56889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.56890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.56892: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882427.56894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.56932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.56955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.56959: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.57114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.58807: stdout chunk (state=3): >>>/root <<< 12081 1726882427.58972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.58976: stdout chunk (state=3): >>><<< 12081 1726882427.58986: stderr chunk (state=3): >>><<< 12081 1726882427.59015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882427.59029: _low_level_execute_command(): starting 12081 1726882427.59035: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939 `" && echo 
ansible-tmp-1726882427.5901406-14239-84316709925939="` echo /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939 `" ) && sleep 0' 12081 1726882427.59668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.59684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.59695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.59713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.59738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.59757: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882427.59761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.59784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.59788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.59790: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.59797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.59817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.59820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.59822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.59841: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882427.59844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.59912: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882427.59926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.59931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.60067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.61945: stdout chunk (state=3): >>>ansible-tmp-1726882427.5901406-14239-84316709925939=/root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939 <<< 12081 1726882427.62149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.62153: stdout chunk (state=3): >>><<< 12081 1726882427.62155: stderr chunk (state=3): >>><<< 12081 1726882427.62474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882427.5901406-14239-84316709925939=/root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882427.62477: variable 'ansible_module_compression' from source: unknown 12081 1726882427.62480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12081 1726882427.62482: variable 'ansible_facts' from source: unknown 12081 1726882427.62484: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/AnsiballZ_stat.py 12081 1726882427.62570: Sending initial data 12081 1726882427.62573: Sent initial data (152 bytes) 12081 1726882427.63416: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.63427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.63437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.63455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.63493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.63501: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882427.63510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.63521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.63529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.63535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.63542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.63551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.63572: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.63582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.63626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.63642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.63653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.63773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.65528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882427.65622: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882427.65718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpvnmqr74x /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/AnsiballZ_stat.py <<< 12081 1726882427.65813: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882427.67260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.67266: stderr chunk 
(state=3): >>><<< 12081 1726882427.67272: stdout chunk (state=3): >>><<< 12081 1726882427.67298: done transferring module to remote 12081 1726882427.67309: _low_level_execute_command(): starting 12081 1726882427.67314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/ /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/AnsiballZ_stat.py && sleep 0' 12081 1726882427.67956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.67969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.67980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.67993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.68032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.68039: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882427.68048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.68067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.68077: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.68083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.68091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.68100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.68110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.68117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.68127: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882427.68132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.68208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.68222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.68236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.68369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.70104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.70193: stderr chunk (state=3): >>><<< 12081 1726882427.70196: stdout chunk (state=3): >>><<< 12081 1726882427.70217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882427.70220: _low_level_execute_command(): starting 12081 1726882427.70223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/AnsiballZ_stat.py && sleep 0' 12081 1726882427.70879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.70887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.70898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.70914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.70953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.70967: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882427.70978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.70992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.70999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.71006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.71013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.71023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.71034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.71042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.71048: stderr chunk (state=3): >>>debug2: match found 
<<< 12081 1726882427.71060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.71132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.71147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.71152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.71299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.84340: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28765, "dev": 21, "nlink": 1, "atime": 1726882425.2607832, "mtime": 1726882425.2607832, "ctime": 1726882425.2607832, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12081 1726882427.85326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882427.85350: stderr chunk (state=3): >>><<< 12081 1726882427.85353: stdout chunk (state=3): >>><<< 12081 1726882427.85508: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28765, "dev": 21, "nlink": 1, "atime": 1726882425.2607832, "mtime": 1726882425.2607832, "ctime": 1726882425.2607832, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882427.85518: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882427.85522: _low_level_execute_command(): starting 12081 1726882427.85525: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882427.5901406-14239-84316709925939/ > /dev/null 2>&1 && sleep 0' 12081 1726882427.86073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882427.86088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.86103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.86121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.86162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.86181: stderr chunk 
(state=3): >>>debug2: match not found <<< 12081 1726882427.86198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.86214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882427.86225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882427.86236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882427.86248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882427.86263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882427.86284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882427.86297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882427.86308: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882427.86322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882427.86398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882427.86417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882427.86430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882427.86561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882427.88451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882427.88454: stderr chunk (state=3): >>><<< 12081 1726882427.88457: stdout chunk (state=3): >>><<< 12081 1726882427.88673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882427.88676: handler run complete 12081 1726882427.88678: attempt loop complete, returning result 12081 1726882427.88680: _execute() done 12081 1726882427.88681: dumping result to json 12081 1726882427.88683: done dumping result, returning 12081 1726882427.88684: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0e448fcc-3ce9-0a3f-ff3c-000000000a01] 12081 1726882427.88686: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a01 12081 1726882427.88756: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a01 12081 1726882427.88760: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882425.2607832, "block_size": 4096, "blocks": 0, "ctime": 1726882425.2607832, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28765, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, 
"issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882425.2607832, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12081 1726882427.88866: no more pending results, returning what we have 12081 1726882427.88870: results queue empty 12081 1726882427.88870: checking for any_errors_fatal 12081 1726882427.88871: done checking for any_errors_fatal 12081 1726882427.88872: checking for max_fail_percentage 12081 1726882427.88874: done checking for max_fail_percentage 12081 1726882427.88874: checking to see if all hosts have failed and the running result is not ok 12081 1726882427.88875: done checking to see if all hosts have failed 12081 1726882427.88876: getting the remaining hosts for this loop 12081 1726882427.88877: done getting the remaining hosts for this loop 12081 1726882427.88880: getting the next task for host managed_node3 12081 1726882427.88890: done getting next task for host managed_node3 12081 1726882427.88892: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12081 1726882427.88896: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882427.88905: getting variables 12081 1726882427.88907: in VariableManager get_vars() 12081 1726882427.88942: Calling all_inventory to load vars for managed_node3 12081 1726882427.88944: Calling groups_inventory to load vars for managed_node3 12081 1726882427.88947: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.88957: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.88959: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.88962: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.91041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882427.92058: done with get_vars() 12081 1726882427.92086: done getting variables 12081 1726882427.92154: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882427.92283: variable 'interface' from source: task vars 12081 1726882427.92287: variable 'dhcp_interface2' from source: play vars 12081 1726882427.92345: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 
'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:47 -0400 (0:00:00.407) 0:00:47.726 ****** 12081 1726882427.92388: entering _queue_task() for managed_node3/assert 12081 1726882427.93102: worker is 1 (out of 1 available) 12081 1726882427.93114: exiting _queue_task() for managed_node3/assert 12081 1726882427.93126: done queuing things up, now waiting for results queue to drain 12081 1726882427.93127: waiting for pending results... 12081 1726882427.93425: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 12081 1726882427.93550: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000984 12081 1726882427.93567: variable 'ansible_search_path' from source: unknown 12081 1726882427.93573: variable 'ansible_search_path' from source: unknown 12081 1726882427.93615: calling self._execute() 12081 1726882427.93717: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.93721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.93731: variable 'omit' from source: magic vars 12081 1726882427.94100: variable 'ansible_distribution_major_version' from source: facts 12081 1726882427.94113: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882427.94119: variable 'omit' from source: magic vars 12081 1726882427.94189: variable 'omit' from source: magic vars 12081 1726882427.94289: variable 'interface' from source: task vars 12081 1726882427.94293: variable 'dhcp_interface2' from source: play vars 12081 1726882427.94361: variable 'dhcp_interface2' from source: play vars 12081 1726882427.94380: variable 'omit' from source: magic vars 12081 1726882427.94422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882427.94462: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882427.94486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882427.94502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.94512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882427.94541: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882427.94544: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.94546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882427.94651: Set connection var ansible_pipelining to False 12081 1726882427.94654: Set connection var ansible_shell_type to sh 12081 1726882427.94666: Set connection var ansible_shell_executable to /bin/sh 12081 1726882427.94670: Set connection var ansible_connection to ssh 12081 1726882427.94686: Set connection var ansible_timeout to 10 12081 1726882427.94718: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882427.94744: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.94747: variable 'ansible_connection' from source: unknown 12081 1726882427.94750: variable 'ansible_module_compression' from source: unknown 12081 1726882427.94752: variable 'ansible_shell_type' from source: unknown 12081 1726882427.94754: variable 'ansible_shell_executable' from source: unknown 12081 1726882427.94759: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882427.94761: variable 'ansible_pipelining' from source: unknown 12081 1726882427.94765: variable 'ansible_timeout' from source: unknown 12081 1726882427.94771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882427.94909: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882427.94918: variable 'omit' from source: magic vars 12081 1726882427.94931: starting attempt loop 12081 1726882427.94934: running the handler 12081 1726882427.95727: variable 'interface_stat' from source: set_fact 12081 1726882427.95748: Evaluated conditional (interface_stat.stat.exists): True 12081 1726882427.95753: handler run complete 12081 1726882427.95776: attempt loop complete, returning result 12081 1726882427.95779: _execute() done 12081 1726882427.95782: dumping result to json 12081 1726882427.95784: done dumping result, returning 12081 1726882427.95790: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-0a3f-ff3c-000000000984] 12081 1726882427.95797: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000984 12081 1726882427.95891: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000984 12081 1726882427.95893: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12081 1726882427.95973: no more pending results, returning what we have 12081 1726882427.95977: results queue empty 12081 1726882427.95977: checking for any_errors_fatal 12081 1726882427.95988: done checking for any_errors_fatal 12081 1726882427.95989: checking for max_fail_percentage 12081 1726882427.95991: done checking for max_fail_percentage 12081 1726882427.95992: checking to see if all hosts have failed and the running result is not ok 12081 1726882427.95993: done checking to see if all hosts have failed 12081 1726882427.95994: getting the remaining hosts for this loop 12081 1726882427.95996: done getting the remaining hosts for this loop 
12081 1726882427.96000: getting the next task for host managed_node3 12081 1726882427.96010: done getting next task for host managed_node3 12081 1726882427.96014: ^ task is: TASK: Test 12081 1726882427.96017: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882427.96022: getting variables 12081 1726882427.96024: in VariableManager get_vars() 12081 1726882427.96069: Calling all_inventory to load vars for managed_node3 12081 1726882427.96072: Calling groups_inventory to load vars for managed_node3 12081 1726882427.96075: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882427.96088: Calling all_plugins_play to load vars for managed_node3 12081 1726882427.96091: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882427.96093: Calling groups_plugins_play to load vars for managed_node3 12081 1726882427.98097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.00584: done with get_vars() 12081 1726882428.00617: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:33:48 -0400 (0:00:00.083) 0:00:47.810 ****** 12081 1726882428.00714: entering _queue_task() for managed_node3/include_tasks 12081 1726882428.01051: worker is 1 (out of 
1 available) 12081 1726882428.01070: exiting _queue_task() for managed_node3/include_tasks 12081 1726882428.01086: done queuing things up, now waiting for results queue to drain 12081 1726882428.01087: waiting for pending results... 12081 1726882428.01454: running TaskExecutor() for managed_node3/TASK: Test 12081 1726882428.01561: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008ee 12081 1726882428.01575: variable 'ansible_search_path' from source: unknown 12081 1726882428.01578: variable 'ansible_search_path' from source: unknown 12081 1726882428.01917: variable 'lsr_test' from source: include params 12081 1726882428.01921: variable 'lsr_test' from source: include params 12081 1726882428.01924: variable 'omit' from source: magic vars 12081 1726882428.02203: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.02207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.02209: variable 'omit' from source: magic vars 12081 1726882428.03017: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.03022: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.03025: variable 'item' from source: unknown 12081 1726882428.03027: variable 'item' from source: unknown 12081 1726882428.03028: variable 'item' from source: unknown 12081 1726882428.03030: variable 'item' from source: unknown 12081 1726882428.03133: dumping result to json 12081 1726882428.03137: done dumping result, returning 12081 1726882428.03140: done running TaskExecutor() for managed_node3/TASK: Test [0e448fcc-3ce9-0a3f-ff3c-0000000008ee] 12081 1726882428.03143: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ee 12081 1726882428.03185: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ee 12081 1726882428.03189: WORKER PROCESS EXITING 12081 1726882428.03211: no more pending results, returning what we have 12081 1726882428.03215: in VariableManager 
get_vars() 12081 1726882428.03257: Calling all_inventory to load vars for managed_node3 12081 1726882428.03260: Calling groups_inventory to load vars for managed_node3 12081 1726882428.03267: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.03279: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.03282: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.03285: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.05241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.07959: done with get_vars() 12081 1726882428.07994: variable 'ansible_search_path' from source: unknown 12081 1726882428.07996: variable 'ansible_search_path' from source: unknown 12081 1726882428.08037: we have included files to process 12081 1726882428.08038: generating all_blocks data 12081 1726882428.08040: done generating all_blocks data 12081 1726882428.08045: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12081 1726882428.08046: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12081 1726882428.08049: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12081 1726882428.08319: in VariableManager get_vars() 12081 1726882428.08344: done with get_vars() 12081 1726882428.08348: variable 'omit' from source: magic vars 12081 1726882428.08392: variable 'omit' from source: magic vars 12081 1726882428.08444: in VariableManager get_vars() 12081 1726882428.08462: done with get_vars() 12081 1726882428.08490: in VariableManager get_vars() 12081 1726882428.08507: done with get_vars() 12081 1726882428.08543: Loading data 
from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12081 1726882428.08739: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12081 1726882428.08825: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12081 1726882428.09494: in VariableManager get_vars() 12081 1726882428.09517: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882428.11743: done processing included file 12081 1726882428.11746: iterating over new_blocks loaded from include file 12081 1726882428.11747: in VariableManager get_vars() 12081 1726882428.11791: done with get_vars() 12081 1726882428.11793: filtering new block on tags 12081 1726882428.12066: done filtering new block on tags 12081 1726882428.12071: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed_node3 => (item=tasks/create_bond_profile_reconfigure.yml) 12081 1726882428.12077: extending task lists for all hosts with included blocks 12081 1726882428.14378: done extending task lists 12081 1726882428.14380: done processing included files 12081 1726882428.14381: results queue empty 12081 1726882428.14382: checking for any_errors_fatal 12081 1726882428.14386: done checking for any_errors_fatal 12081 1726882428.14386: checking for max_fail_percentage 12081 1726882428.14388: done checking for max_fail_percentage 12081 1726882428.14389: checking to see if all hosts have failed and the running result is not ok 12081 1726882428.14390: done checking to see if all hosts have failed 12081 1726882428.14390: getting the remaining hosts for this loop 12081 1726882428.14392: done getting the remaining hosts for this loop 12081 1726882428.14395: getting the next task for host 
managed_node3 12081 1726882428.14400: done getting next task for host managed_node3 12081 1726882428.14403: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882428.14407: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882428.14419: getting variables 12081 1726882428.14420: in VariableManager get_vars() 12081 1726882428.14441: Calling all_inventory to load vars for managed_node3 12081 1726882428.14444: Calling groups_inventory to load vars for managed_node3 12081 1726882428.14446: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.14452: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.14458: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.14462: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.18018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.19856: done with get_vars() 12081 1726882428.19890: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:48 -0400 (0:00:00.192) 0:00:48.002 ****** 12081 1726882428.19984: entering _queue_task() for managed_node3/include_tasks 12081 1726882428.21109: worker is 1 (out of 1 available) 12081 1726882428.21145: exiting _queue_task() for managed_node3/include_tasks 12081 1726882428.21240: done queuing things up, now waiting for results queue to drain 12081 1726882428.21241: waiting for pending results... 
12081 1726882428.21821: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882428.22147: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a2e 12081 1726882428.22165: variable 'ansible_search_path' from source: unknown 12081 1726882428.22186: variable 'ansible_search_path' from source: unknown 12081 1726882428.22331: calling self._execute() 12081 1726882428.22533: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.22537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.22547: variable 'omit' from source: magic vars 12081 1726882428.23000: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.23014: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.23026: _execute() done 12081 1726882428.23030: dumping result to json 12081 1726882428.23034: done dumping result, returning 12081 1726882428.23042: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-0a3f-ff3c-000000000a2e] 12081 1726882428.23049: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a2e 12081 1726882428.23162: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a2e 12081 1726882428.23167: WORKER PROCESS EXITING 12081 1726882428.23217: no more pending results, returning what we have 12081 1726882428.23223: in VariableManager get_vars() 12081 1726882428.23287: Calling all_inventory to load vars for managed_node3 12081 1726882428.23290: Calling groups_inventory to load vars for managed_node3 12081 1726882428.23293: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.23307: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.23310: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.23313: Calling 
groups_plugins_play to load vars for managed_node3 12081 1726882428.25779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.29682: done with get_vars() 12081 1726882428.29717: variable 'ansible_search_path' from source: unknown 12081 1726882428.29719: variable 'ansible_search_path' from source: unknown 12081 1726882428.29786: we have included files to process 12081 1726882428.29788: generating all_blocks data 12081 1726882428.29789: done generating all_blocks data 12081 1726882428.29791: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882428.29792: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882428.29795: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882428.30983: done processing included file 12081 1726882428.30986: iterating over new_blocks loaded from include file 12081 1726882428.30987: in VariableManager get_vars() 12081 1726882428.31047: done with get_vars() 12081 1726882428.31049: filtering new block on tags 12081 1726882428.31117: done filtering new block on tags 12081 1726882428.31120: in VariableManager get_vars() 12081 1726882428.31180: done with get_vars() 12081 1726882428.31182: filtering new block on tags 12081 1726882428.31287: done filtering new block on tags 12081 1726882428.31291: in VariableManager get_vars() 12081 1726882428.31319: done with get_vars() 12081 1726882428.31321: filtering new block on tags 12081 1726882428.31472: done filtering new block on tags 12081 1726882428.31478: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12081 1726882428.31484: extending task lists for 
all hosts with included blocks 12081 1726882428.34433: done extending task lists 12081 1726882428.34436: done processing included files 12081 1726882428.34437: results queue empty 12081 1726882428.34437: checking for any_errors_fatal 12081 1726882428.34447: done checking for any_errors_fatal 12081 1726882428.34448: checking for max_fail_percentage 12081 1726882428.34450: done checking for max_fail_percentage 12081 1726882428.34451: checking to see if all hosts have failed and the running result is not ok 12081 1726882428.34452: done checking to see if all hosts have failed 12081 1726882428.34455: getting the remaining hosts for this loop 12081 1726882428.34456: done getting the remaining hosts for this loop 12081 1726882428.34460: getting the next task for host managed_node3 12081 1726882428.34466: done getting next task for host managed_node3 12081 1726882428.34469: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882428.34473: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882428.34486: getting variables 12081 1726882428.34487: in VariableManager get_vars() 12081 1726882428.34508: Calling all_inventory to load vars for managed_node3 12081 1726882428.34511: Calling groups_inventory to load vars for managed_node3 12081 1726882428.34513: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.34519: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.34521: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.34524: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.36550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.39371: done with get_vars() 12081 1726882428.39403: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:48 -0400 (0:00:00.195) 0:00:48.197 ****** 12081 1726882428.39497: entering _queue_task() for managed_node3/setup 12081 1726882428.39861: worker is 1 (out of 1 available) 12081 1726882428.39879: exiting _queue_task() for managed_node3/setup 12081 1726882428.39893: done queuing things up, now waiting for results queue to drain 12081 1726882428.39894: waiting for pending results... 
12081 1726882428.40231: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882428.40412: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000b10 12081 1726882428.40425: variable 'ansible_search_path' from source: unknown 12081 1726882428.40428: variable 'ansible_search_path' from source: unknown 12081 1726882428.40477: calling self._execute() 12081 1726882428.40576: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.40581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.40591: variable 'omit' from source: magic vars 12081 1726882428.40978: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.40996: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.41241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882428.43412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882428.43467: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882428.43495: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882428.43527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882428.43547: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882428.43612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882428.43633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882428.43652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882428.43686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882428.43702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882428.43740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882428.43757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882428.43778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882428.43809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882428.43820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882428.43932: variable '__network_required_facts' from source: role 
'' defaults 12081 1726882428.43944: variable 'ansible_facts' from source: unknown 12081 1726882428.44915: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12081 1726882428.44919: when evaluation is False, skipping this task 12081 1726882428.44922: _execute() done 12081 1726882428.44924: dumping result to json 12081 1726882428.44928: done dumping result, returning 12081 1726882428.44934: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-0a3f-ff3c-000000000b10] 12081 1726882428.44957: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b10 12081 1726882428.45045: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b10 12081 1726882428.45049: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882428.45102: no more pending results, returning what we have 12081 1726882428.45107: results queue empty 12081 1726882428.45108: checking for any_errors_fatal 12081 1726882428.45110: done checking for any_errors_fatal 12081 1726882428.45110: checking for max_fail_percentage 12081 1726882428.45112: done checking for max_fail_percentage 12081 1726882428.45113: checking to see if all hosts have failed and the running result is not ok 12081 1726882428.45114: done checking to see if all hosts have failed 12081 1726882428.45115: getting the remaining hosts for this loop 12081 1726882428.45117: done getting the remaining hosts for this loop 12081 1726882428.45122: getting the next task for host managed_node3 12081 1726882428.45134: done getting next task for host managed_node3 12081 1726882428.45139: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882428.45145: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
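The skip recorded above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False: every fact the role requires was already gathered. A minimal sketch of that set-difference check, assuming Jinja2's `difference` filter behaves like a plain set difference (the fact names below are illustrative, not taken from the role):

```python
def missing_facts(required, gathered_facts):
    """Return the required fact names that are absent from the gathered facts dict,
    mirroring `required | difference(gathered_facts.keys() | list)`."""
    return sorted(set(required) - set(gathered_facts.keys()))

# Hypothetical example: when nothing is missing, the `when` condition
# (`length > 0`) is False and the setup task is skipped, as seen in the log.
required = ["distribution", "distribution_major_version", "os_family"]
facts = {"distribution": "Fedora", "distribution_major_version": "41", "os_family": "RedHat"}
print(missing_facts(required, facts))  # -> []
```

With a non-empty result the role would instead run its `setup` task to gather the missing facts.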
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882428.45173: getting variables 12081 1726882428.45176: in VariableManager get_vars() 12081 1726882428.45224: Calling all_inventory to load vars for managed_node3 12081 1726882428.45227: Calling groups_inventory to load vars for managed_node3 12081 1726882428.45230: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.45242: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.45246: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.45258: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.46347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.52886: done with get_vars() 12081 1726882428.52926: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:48 -0400 (0:00:00.135) 0:00:48.333 ****** 12081 1726882428.53029: entering _queue_task() for managed_node3/stat 12081 1726882428.53391: worker is 1 (out of 1 available) 12081 1726882428.53403: exiting _queue_task() for managed_node3/stat 12081 1726882428.53417: done queuing things up, now waiting for results queue to drain 12081 1726882428.53418: waiting for pending results... 
12081 1726882428.53746: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882428.53933: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000b12 12081 1726882428.53946: variable 'ansible_search_path' from source: unknown 12081 1726882428.53950: variable 'ansible_search_path' from source: unknown 12081 1726882428.53998: calling self._execute() 12081 1726882428.54106: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.54111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.54120: variable 'omit' from source: magic vars 12081 1726882428.54507: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.54533: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.54711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882428.55013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882428.55058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882428.55130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882428.55167: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882428.55257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882428.55290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882428.55321: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882428.55347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882428.55447: variable '__network_is_ostree' from source: set_fact 12081 1726882428.55454: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882428.55462: when evaluation is False, skipping this task 12081 1726882428.55465: _execute() done 12081 1726882428.55468: dumping result to json 12081 1726882428.55475: done dumping result, returning 12081 1726882428.55481: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000b12] 12081 1726882428.55487: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b12 12081 1726882428.55595: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b12 12081 1726882428.55598: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882428.55660: no more pending results, returning what we have 12081 1726882428.55668: results queue empty 12081 1726882428.55669: checking for any_errors_fatal 12081 1726882428.55678: done checking for any_errors_fatal 12081 1726882428.55679: checking for max_fail_percentage 12081 1726882428.55681: done checking for max_fail_percentage 12081 1726882428.55682: checking to see if all hosts have failed and the running result is not ok 12081 1726882428.55683: done checking to see if all hosts have failed 12081 1726882428.55684: getting the remaining hosts for this loop 12081 1726882428.55686: done getting the remaining hosts for this loop 12081 
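Both the "Check if system is ostree" skip above and the "Set flag" skip that follows hinge on the same guard, `when: not __network_is_ostree is defined`. A minimal sketch of that pattern, under the assumption that the flag was set by an earlier `set_fact` in this run (variable value below is illustrative):

```python
def needs_ostree_check(host_vars):
    """True only when the __network_is_ostree flag has not been set yet,
    mirroring `not __network_is_ostree is defined`."""
    return "__network_is_ostree" not in host_vars

# The flag already exists from an earlier set_fact, so the guard is False
# and both the stat task and the set_fact task are skipped.
host_vars = {"__network_is_ostree": False}
print(needs_ostree_check(host_vars))  # -> False
```

This memoization-style guard means the relatively expensive `stat` probe runs at most once per host per play.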
1726882428.55690: getting the next task for host managed_node3 12081 1726882428.55700: done getting next task for host managed_node3 12081 1726882428.55703: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882428.55710: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882428.55734: getting variables 12081 1726882428.55736: in VariableManager get_vars() 12081 1726882428.55788: Calling all_inventory to load vars for managed_node3 12081 1726882428.55791: Calling groups_inventory to load vars for managed_node3 12081 1726882428.55794: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.55806: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.55809: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.55812: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.57624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.59402: done with get_vars() 12081 1726882428.59439: done getting variables 12081 1726882428.59506: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:48 -0400 (0:00:00.065) 0:00:48.398 ****** 12081 1726882428.59562: entering _queue_task() for managed_node3/set_fact 12081 1726882428.59921: worker is 1 (out of 1 available) 12081 1726882428.59933: exiting _queue_task() for managed_node3/set_fact 12081 1726882428.59945: done queuing things up, now waiting for results queue to drain 12081 1726882428.59947: waiting for pending results... 
12081 1726882428.60260: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882428.60452: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000b13 12081 1726882428.60468: variable 'ansible_search_path' from source: unknown 12081 1726882428.60472: variable 'ansible_search_path' from source: unknown 12081 1726882428.60514: calling self._execute() 12081 1726882428.60616: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.60621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.60635: variable 'omit' from source: magic vars 12081 1726882428.61031: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.61154: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.61342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882428.61647: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882428.61780: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882428.61855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882428.61984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882428.62115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882428.62256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882428.62291: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882428.62316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882428.62525: variable '__network_is_ostree' from source: set_fact 12081 1726882428.62533: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882428.62536: when evaluation is False, skipping this task 12081 1726882428.62539: _execute() done 12081 1726882428.62541: dumping result to json 12081 1726882428.62544: done dumping result, returning 12081 1726882428.62553: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000b13] 12081 1726882428.62677: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b13 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882428.62830: no more pending results, returning what we have 12081 1726882428.62835: results queue empty 12081 1726882428.62836: checking for any_errors_fatal 12081 1726882428.62841: done checking for any_errors_fatal 12081 1726882428.62842: checking for max_fail_percentage 12081 1726882428.62844: done checking for max_fail_percentage 12081 1726882428.62845: checking to see if all hosts have failed and the running result is not ok 12081 1726882428.62846: done checking to see if all hosts have failed 12081 1726882428.62847: getting the remaining hosts for this loop 12081 1726882428.62849: done getting the remaining hosts for this loop 12081 1726882428.62856: getting the next task for host managed_node3 12081 1726882428.62873: done getting next task for host managed_node3 12081 
1726882428.62878: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882428.62884: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882428.62903: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b13 12081 1726882428.62908: WORKER PROCESS EXITING 12081 1726882428.62921: getting variables 12081 1726882428.62924: in VariableManager get_vars() 12081 1726882428.62978: Calling all_inventory to load vars for managed_node3 12081 1726882428.62981: Calling groups_inventory to load vars for managed_node3 12081 1726882428.62984: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882428.62996: Calling all_plugins_play to load vars for managed_node3 12081 1726882428.63000: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882428.63003: Calling groups_plugins_play to load vars for managed_node3 12081 1726882428.66277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882428.69749: done with get_vars() 12081 1726882428.69786: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:48 -0400 (0:00:00.103) 0:00:48.502 ****** 12081 1726882428.69918: entering _queue_task() for managed_node3/service_facts 12081 1726882428.70945: worker is 1 (out of 1 available) 12081 1726882428.70961: exiting _queue_task() for managed_node3/service_facts 12081 1726882428.70975: done queuing things up, now waiting for results queue to drain 12081 1726882428.70977: waiting for pending results... 
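The task being queued here runs the `service_facts` module, whose stdout later in this log returns an `ansible_facts.services` mapping of unit name to `{name, state, status, source}`. A minimal Python sketch of consuming that structure — filtering it down to running units, the way a role might gate later tasks on it. The `services` dict literal is a trimmed, hypothetical subset of the real output, and `running_services` is an illustrative helper, not part of Ansible:

```python
# Sketch: filter an ansible_facts.services mapping (the shape returned by
# the service_facts module, as seen in the stdout chunks later in this log)
# down to the units currently running.
# NOTE: the sample data is a trimmed, hypothetical subset of the real output.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "autofs.service": {"name": "autofs.service", "state": "stopped",
                       "status": "not-found", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
}

def running_services(services):
    """Return the sorted names of units whose state is 'running'."""
    return sorted(name for name, info in services.items()
                  if info.get("state") == "running")

print(running_services(services))
# -> ['NetworkManager.service', 'sshd.service']
```

In a playbook the equivalent check is usually a `when:` expression over `ansible_facts.services`, e.g. testing whether a unit's `state` is `running` before restarting it.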
12081 1726882428.71468: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882428.71639: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000b15 12081 1726882428.71652: variable 'ansible_search_path' from source: unknown 12081 1726882428.71655: variable 'ansible_search_path' from source: unknown 12081 1726882428.71701: calling self._execute() 12081 1726882428.71801: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.71811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.71820: variable 'omit' from source: magic vars 12081 1726882428.72206: variable 'ansible_distribution_major_version' from source: facts 12081 1726882428.72224: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882428.72230: variable 'omit' from source: magic vars 12081 1726882428.72323: variable 'omit' from source: magic vars 12081 1726882428.72368: variable 'omit' from source: magic vars 12081 1726882428.72408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882428.72444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882428.72471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882428.72489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882428.72501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882428.72529: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882428.72533: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.72535: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882428.72673: Set connection var ansible_pipelining to False 12081 1726882428.72677: Set connection var ansible_shell_type to sh 12081 1726882428.72684: Set connection var ansible_shell_executable to /bin/sh 12081 1726882428.72687: Set connection var ansible_connection to ssh 12081 1726882428.72692: Set connection var ansible_timeout to 10 12081 1726882428.72697: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882428.72730: variable 'ansible_shell_executable' from source: unknown 12081 1726882428.72733: variable 'ansible_connection' from source: unknown 12081 1726882428.72736: variable 'ansible_module_compression' from source: unknown 12081 1726882428.72739: variable 'ansible_shell_type' from source: unknown 12081 1726882428.72741: variable 'ansible_shell_executable' from source: unknown 12081 1726882428.72743: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882428.72746: variable 'ansible_pipelining' from source: unknown 12081 1726882428.72748: variable 'ansible_timeout' from source: unknown 12081 1726882428.72751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882428.73047: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882428.73057: variable 'omit' from source: magic vars 12081 1726882428.73066: starting attempt loop 12081 1726882428.73069: running the handler 12081 1726882428.73082: _low_level_execute_command(): starting 12081 1726882428.73095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882428.73889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882428.73901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882428.73910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.73926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.73965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.73974: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882428.73987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.74000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882428.74008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882428.74015: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882428.74022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.74031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.74043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.74051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.74061: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882428.74072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.74144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882428.74162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882428.74168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882428.74311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882428.76014: stdout chunk (state=3): >>>/root <<< 12081 1726882428.76202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882428.76206: stdout chunk (state=3): >>><<< 12081 1726882428.76209: stderr chunk (state=3): >>><<< 12081 1726882428.76362: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882428.76368: _low_level_execute_command(): starting 12081 1726882428.76371: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505 `" && echo ansible-tmp-1726882428.7625327-14281-45452436779505="` echo /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505 `" ) && sleep 0' 12081 1726882428.76954: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12081 1726882428.76972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.76987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.77009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.77054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.77070: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882428.77084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.77100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882428.77110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882428.77125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882428.77140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.77152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.77168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.77179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.77189: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882428.77201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.77286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882428.77308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882428.77324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12081 1726882428.77738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882428.79344: stdout chunk (state=3): >>>ansible-tmp-1726882428.7625327-14281-45452436779505=/root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505 <<< 12081 1726882428.79551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882428.79557: stdout chunk (state=3): >>><<< 12081 1726882428.79559: stderr chunk (state=3): >>><<< 12081 1726882428.79874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882428.7625327-14281-45452436779505=/root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882428.79877: variable 'ansible_module_compression' from source: unknown 12081 1726882428.79880: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12081 1726882428.79882: variable 'ansible_facts' from source: unknown 12081 1726882428.79884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/AnsiballZ_service_facts.py 12081 1726882428.79995: Sending initial data 12081 1726882428.79998: Sent initial data (161 bytes) 12081 1726882428.82235: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882428.82255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.82281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.82300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.82341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.82356: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882428.82374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.82392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882428.82404: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882428.82416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882428.82428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.82443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.82467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.82481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
<<< 12081 1726882428.82492: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882428.82507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.82587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882428.82604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882428.82619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882428.82749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882428.84523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882428.84613: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882428.84714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpi0ak836h /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/AnsiballZ_service_facts.py <<< 12081 1726882428.84808: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882428.86264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882428.86372: stderr chunk (state=3): >>><<< 12081 1726882428.86377: stdout chunk (state=3): >>><<< 12081 1726882428.86380: done 
transferring module to remote 12081 1726882428.86386: _low_level_execute_command(): starting 12081 1726882428.86388: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/ /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/AnsiballZ_service_facts.py && sleep 0' 12081 1726882428.87038: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882428.87057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.87081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.87100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.87146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.87162: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882428.87183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.87202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882428.87215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882428.87227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882428.87244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.87262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882428.87285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.87299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882428.87310: stderr chunk (state=3): >>>debug2: 
match found <<< 12081 1726882428.87322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.87404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882428.87422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882428.87436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882428.87576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882428.89290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882428.89339: stderr chunk (state=3): >>><<< 12081 1726882428.89343: stdout chunk (state=3): >>><<< 12081 1726882428.89361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882428.89366: 
_low_level_execute_command(): starting 12081 1726882428.89371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/AnsiballZ_service_facts.py && sleep 0' 12081 1726882428.89818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.89824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882428.89869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882428.89873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882428.89885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882428.89939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882428.89943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882428.89949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882428.90052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.20068: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 12081 1726882430.20122: stdout chunk (state=3): >>>"source": 
"systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 12081 1726882430.20136: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": 
"systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 12081 1726882430.20140: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "m<<< 12081 1726882430.20144: stdout chunk (state=3): >>>an-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "sta<<< 12081 1726882430.20147: stdout chunk (state=3): >>>tic", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12081 1726882430.21484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882430.21488: stderr chunk (state=3): >>><<< 12081 1726882430.21498: stdout chunk (state=3): >>><<< 12081 1726882430.21530: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": 
"getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": 
{"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882430.22227: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882430.22237: _low_level_execute_command(): starting 12081 1726882430.22239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882428.7625327-14281-45452436779505/ > /dev/null 2>&1 && sleep 0' 12081 
1726882430.22923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882430.22929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.22944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.22960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.23000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882430.23007: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882430.23017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.23029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882430.23036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882430.23044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882430.23055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.23070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.23081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.23088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882430.23094: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882430.23103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.23181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882430.23196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.23201: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.23368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.25155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882430.25201: stderr chunk (state=3): >>><<< 12081 1726882430.25207: stdout chunk (state=3): >>><<< 12081 1726882430.25255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882430.25258: handler run complete 12081 1726882430.25447: variable 'ansible_facts' from source: unknown 12081 1726882430.25602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882430.26060: variable 'ansible_facts' from source: unknown 12081 1726882430.26193: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882430.26381: attempt loop complete, returning result 12081 1726882430.26384: _execute() done 12081 1726882430.26387: dumping result to json 12081 1726882430.26440: done dumping result, returning 12081 1726882430.26450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-0a3f-ff3c-000000000b15] 12081 1726882430.26466: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b15 12081 1726882430.27202: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b15 12081 1726882430.27204: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882430.27303: no more pending results, returning what we have 12081 1726882430.27306: results queue empty 12081 1726882430.27306: checking for any_errors_fatal 12081 1726882430.27311: done checking for any_errors_fatal 12081 1726882430.27314: checking for max_fail_percentage 12081 1726882430.27315: done checking for max_fail_percentage 12081 1726882430.27316: checking to see if all hosts have failed and the running result is not ok 12081 1726882430.27317: done checking to see if all hosts have failed 12081 1726882430.27318: getting the remaining hosts for this loop 12081 1726882430.27320: done getting the remaining hosts for this loop 12081 1726882430.27323: getting the next task for host managed_node3 12081 1726882430.27329: done getting next task for host managed_node3 12081 1726882430.27333: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882430.27338: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882430.27353: getting variables 12081 1726882430.27354: in VariableManager get_vars() 12081 1726882430.27396: Calling all_inventory to load vars for managed_node3 12081 1726882430.27399: Calling groups_inventory to load vars for managed_node3 12081 1726882430.27401: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882430.27414: Calling all_plugins_play to load vars for managed_node3 12081 1726882430.27416: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882430.27419: Calling groups_plugins_play to load vars for managed_node3 12081 1726882430.29469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882430.31758: done with get_vars() 12081 1726882430.31791: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:50 -0400 (0:00:01.619) 0:00:50.121 ****** 12081 1726882430.31874: entering _queue_task() for managed_node3/package_facts 12081 1726882430.32125: worker is 1 (out of 1 available) 12081 1726882430.32137: exiting _queue_task() for managed_node3/package_facts 12081 1726882430.32151: done queuing things up, now waiting for results queue to drain 12081 1726882430.32155: waiting for pending results... 
12081 1726882430.32334: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882430.32474: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000b16 12081 1726882430.32491: variable 'ansible_search_path' from source: unknown 12081 1726882430.32497: variable 'ansible_search_path' from source: unknown 12081 1726882430.32532: calling self._execute() 12081 1726882430.32613: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882430.32622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882430.32633: variable 'omit' from source: magic vars 12081 1726882430.32965: variable 'ansible_distribution_major_version' from source: facts 12081 1726882430.32987: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882430.32990: variable 'omit' from source: magic vars 12081 1726882430.33052: variable 'omit' from source: magic vars 12081 1726882430.33098: variable 'omit' from source: magic vars 12081 1726882430.33129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882430.33155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882430.33602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882430.33605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882430.33607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882430.33610: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882430.33612: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882430.33615: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882430.33617: Set connection var ansible_pipelining to False 12081 1726882430.33620: Set connection var ansible_shell_type to sh 12081 1726882430.33622: Set connection var ansible_shell_executable to /bin/sh 12081 1726882430.33624: Set connection var ansible_connection to ssh 12081 1726882430.33627: Set connection var ansible_timeout to 10 12081 1726882430.33629: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882430.33631: variable 'ansible_shell_executable' from source: unknown 12081 1726882430.33633: variable 'ansible_connection' from source: unknown 12081 1726882430.33636: variable 'ansible_module_compression' from source: unknown 12081 1726882430.33638: variable 'ansible_shell_type' from source: unknown 12081 1726882430.33640: variable 'ansible_shell_executable' from source: unknown 12081 1726882430.33642: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882430.33644: variable 'ansible_pipelining' from source: unknown 12081 1726882430.33646: variable 'ansible_timeout' from source: unknown 12081 1726882430.33649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882430.34177: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882430.34182: variable 'omit' from source: magic vars 12081 1726882430.34184: starting attempt loop 12081 1726882430.34186: running the handler 12081 1726882430.34188: _low_level_execute_command(): starting 12081 1726882430.34190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882430.35227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12081 1726882430.35236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.35277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882430.35290: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882430.35307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.35329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882430.35341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882430.35354: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882430.35368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.35382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.35395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.35407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882430.35421: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882430.35438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.35515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882430.35543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.35561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.35709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.37319: stdout chunk (state=3): >>>/root <<< 12081 1726882430.37422: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 12081 1726882430.37485: stderr chunk (state=3): >>><<< 12081 1726882430.37501: stdout chunk (state=3): >>><<< 12081 1726882430.37560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882430.37566: _low_level_execute_command(): starting 12081 1726882430.37569: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435 `" && echo ansible-tmp-1726882430.375176-14352-123674581149435="` echo /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435 `" ) && sleep 0' 12081 1726882430.38228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.38232: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.38281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.38288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.38303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882430.38308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.38397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.38416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.38540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.40405: stdout chunk (state=3): >>>ansible-tmp-1726882430.375176-14352-123674581149435=/root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435 <<< 12081 1726882430.40514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882430.40568: stderr chunk (state=3): >>><<< 12081 1726882430.40571: stdout chunk (state=3): >>><<< 12081 1726882430.40587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882430.375176-14352-123674581149435=/root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882430.40630: variable 'ansible_module_compression' from source: unknown 12081 1726882430.40675: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12081 1726882430.40726: variable 'ansible_facts' from source: unknown 12081 1726882430.40862: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/AnsiballZ_package_facts.py 12081 1726882430.40980: Sending initial data 12081 1726882430.40984: Sent initial data (161 bytes) 12081 1726882430.41655: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.41668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.41715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882430.41718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.41721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.41723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.41774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.41784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.41897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.43612: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882430.43709: stderr chunk (state=3): >>>debug1: Using server 
download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882430.43808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpy3348m6g /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/AnsiballZ_package_facts.py <<< 12081 1726882430.43902: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882430.45898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882430.46006: stderr chunk (state=3): >>><<< 12081 1726882430.46009: stdout chunk (state=3): >>><<< 12081 1726882430.46028: done transferring module to remote 12081 1726882430.46037: _low_level_execute_command(): starting 12081 1726882430.46042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/ /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/AnsiballZ_package_facts.py && sleep 0' 12081 1726882430.46501: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.46508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.46540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882430.46547: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.46560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882430.46568: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882430.46579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.46585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882430.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.46646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.46651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.46775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.48536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882430.48587: stderr chunk (state=3): >>><<< 12081 1726882430.48590: stdout chunk (state=3): >>><<< 12081 1726882430.48604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882430.48607: _low_level_execute_command(): starting 12081 1726882430.48611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/AnsiballZ_package_facts.py && sleep 0' 12081 1726882430.49048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882430.49054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882430.49103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.49107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882430.49109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882430.49158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882430.49178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882430.49289: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882430.94913: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 12081 1726882430.94945: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 12081 1726882430.94978: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", 
"version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 12081 1726882430.95021: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 12081 1726882430.95038: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": 
[{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 12081 1726882430.95041: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": 
"252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", 
"version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": 
"kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": 
"nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": 
"microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": 
"iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, 
"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": 
[{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": 
"elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 12081 1726882430.95130: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", 
"version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 12081 1726882430.95149: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12081 1726882430.96722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882430.96725: stdout chunk (state=3): >>><<< 12081 1726882430.96728: stderr chunk (state=3): >>><<< 12081 1726882430.97078: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", 
"version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": 
"lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": 
[{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": 
"2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": 
[{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": 
[{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": 
"dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, 
"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": 
"481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": 
"yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882430.99947: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882430.99969: _low_level_execute_command(): starting 12081 1726882430.99973: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882430.375176-14352-123674581149435/ > /dev/null 2>&1 && sleep 0' 12081 1726882431.00641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882431.00651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882431.00667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882431.00680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882431.00722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882431.00730: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882431.00859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882431.00863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882431.00867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
<<< 12081 1726882431.00885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882431.00888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882431.00891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882431.01002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882431.01005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882431.01008: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882431.01010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882431.01013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882431.01015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882431.01022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882431.01148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882431.03043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882431.03046: stdout chunk (state=3): >>><<< 12081 1726882431.03055: stderr chunk (state=3): >>><<< 12081 1726882431.03068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882431.03075: handler run complete 12081 1726882431.03597: variable 'ansible_facts' from source: unknown 12081 1726882431.03951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.05814: variable 'ansible_facts' from source: unknown 12081 1726882431.06094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.06543: attempt loop complete, returning result 12081 1726882431.06554: _execute() done 12081 1726882431.06557: dumping result to json 12081 1726882431.06690: done dumping result, returning 12081 1726882431.06697: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-0a3f-ff3c-000000000b16] 12081 1726882431.06703: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b16 12081 1726882431.08828: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000b16 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882431.09001: no more pending results, returning what we have 12081 1726882431.09005: results queue empty 12081 1726882431.09006: checking for any_errors_fatal 12081 1726882431.09014: done checking for 
any_errors_fatal 12081 1726882431.09015: checking for max_fail_percentage 12081 1726882431.09017: done checking for max_fail_percentage 12081 1726882431.09018: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.09019: done checking to see if all hosts have failed 12081 1726882431.09020: getting the remaining hosts for this loop 12081 1726882431.09022: done getting the remaining hosts for this loop 12081 1726882431.09026: getting the next task for host managed_node3 12081 1726882431.09034: done getting next task for host managed_node3 12081 1726882431.09039: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882431.09045: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882431.09058: getting variables 12081 1726882431.09060: in VariableManager get_vars() 12081 1726882431.09109: Calling all_inventory to load vars for managed_node3 12081 1726882431.09112: Calling groups_inventory to load vars for managed_node3 12081 1726882431.09120: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.09131: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.09133: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.09136: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.09770: WORKER PROCESS EXITING 12081 1726882431.09994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.10970: done with get_vars() 12081 1726882431.10998: done getting variables 12081 1726882431.11049: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:51 -0400 (0:00:00.792) 0:00:50.913 ****** 12081 1726882431.11086: entering _queue_task() for managed_node3/debug 12081 1726882431.11373: worker is 1 (out of 1 available) 12081 1726882431.11386: exiting _queue_task() for managed_node3/debug 12081 1726882431.11399: done queuing things up, now waiting for results queue to drain 12081 1726882431.11400: waiting for pending results... 
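The banner above points at a `debug` task defined at `roles/network/tasks/main.yml:7`. Given the variables resolved in the trace (`network_provider` from `set_fact`, the `ansible_distribution_major_version != '6'` guard) and the message the task prints further down (`Using network provider: nm`), a plausible hypothetical sketch of that task is:

```yaml
# Hypothetical sketch; the real task in the role may be worded differently.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'  # guard seen in the trace
```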
12081 1726882431.11746: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882431.11953: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a2f 12081 1726882431.11957: variable 'ansible_search_path' from source: unknown 12081 1726882431.11960: variable 'ansible_search_path' from source: unknown 12081 1726882431.11964: calling self._execute() 12081 1726882431.12041: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.12052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.12070: variable 'omit' from source: magic vars 12081 1726882431.12463: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.12483: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.12497: variable 'omit' from source: magic vars 12081 1726882431.12582: variable 'omit' from source: magic vars 12081 1726882431.12680: variable 'network_provider' from source: set_fact 12081 1726882431.12703: variable 'omit' from source: magic vars 12081 1726882431.12755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882431.12797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882431.12825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882431.12850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882431.12868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882431.12904: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882431.12912: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882431.12920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.13029: Set connection var ansible_pipelining to False 12081 1726882431.13041: Set connection var ansible_shell_type to sh 12081 1726882431.13053: Set connection var ansible_shell_executable to /bin/sh 12081 1726882431.13060: Set connection var ansible_connection to ssh 12081 1726882431.13074: Set connection var ansible_timeout to 10 12081 1726882431.13086: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882431.13116: variable 'ansible_shell_executable' from source: unknown 12081 1726882431.13125: variable 'ansible_connection' from source: unknown 12081 1726882431.13135: variable 'ansible_module_compression' from source: unknown 12081 1726882431.13146: variable 'ansible_shell_type' from source: unknown 12081 1726882431.13153: variable 'ansible_shell_executable' from source: unknown 12081 1726882431.13160: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.13169: variable 'ansible_pipelining' from source: unknown 12081 1726882431.13175: variable 'ansible_timeout' from source: unknown 12081 1726882431.13181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.13325: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882431.13341: variable 'omit' from source: magic vars 12081 1726882431.13344: starting attempt loop 12081 1726882431.13347: running the handler 12081 1726882431.13388: handler run complete 12081 1726882431.13400: attempt loop complete, returning result 12081 1726882431.13403: _execute() done 12081 1726882431.13406: dumping result to json 12081 1726882431.13408: done dumping result, returning 
12081 1726882431.13414: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-0a3f-ff3c-000000000a2f] 12081 1726882431.13420: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a2f 12081 1726882431.13612: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a2f 12081 1726882431.13616: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 12081 1726882431.13707: no more pending results, returning what we have 12081 1726882431.13710: results queue empty 12081 1726882431.13711: checking for any_errors_fatal 12081 1726882431.13737: done checking for any_errors_fatal 12081 1726882431.13738: checking for max_fail_percentage 12081 1726882431.13740: done checking for max_fail_percentage 12081 1726882431.13741: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.13741: done checking to see if all hosts have failed 12081 1726882431.13742: getting the remaining hosts for this loop 12081 1726882431.13743: done getting the remaining hosts for this loop 12081 1726882431.13747: getting the next task for host managed_node3 12081 1726882431.13755: done getting next task for host managed_node3 12081 1726882431.13759: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882431.13765: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.13777: getting variables 12081 1726882431.13778: in VariableManager get_vars() 12081 1726882431.13808: Calling all_inventory to load vars for managed_node3 12081 1726882431.13810: Calling groups_inventory to load vars for managed_node3 12081 1726882431.13812: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.13822: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.13825: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.13828: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.15234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.16503: done with get_vars() 12081 1726882431.16532: done getting variables 12081 1726882431.16600: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:51 -0400 (0:00:00.055) 0:00:50.969 ****** 12081 1726882431.16650: entering _queue_task() for managed_node3/fail 12081 1726882431.16992: worker is 1 (out of 1 available) 12081 1726882431.17005: exiting _queue_task() for managed_node3/fail 12081 1726882431.17019: done queuing things up, now waiting for results queue to drain 12081 1726882431.17020: waiting for pending results... 12081 1726882431.17355: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882431.17522: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a30 12081 1726882431.17526: variable 'ansible_search_path' from source: unknown 12081 1726882431.17529: variable 'ansible_search_path' from source: unknown 12081 1726882431.17573: calling self._execute() 12081 1726882431.17791: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.17801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.17810: variable 'omit' from source: magic vars 12081 1726882431.18212: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.18225: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.18355: variable 'network_state' from source: role '' defaults 12081 1726882431.18382: Evaluated conditional (network_state != {}): False 12081 1726882431.18386: when evaluation is False, skipping this task 12081 1726882431.18388: _execute() done 12081 1726882431.18391: dumping result to json 12081 1726882431.18393: done dumping result, returning 12081 1726882431.18397: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-0a3f-ff3c-000000000a30] 12081 1726882431.18399: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a30 12081 1726882431.18498: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a30 12081 1726882431.18502: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882431.18557: no more pending results, returning what we have 12081 1726882431.18562: results queue empty 12081 1726882431.18563: checking for any_errors_fatal 12081 1726882431.18574: done checking for any_errors_fatal 12081 1726882431.18575: checking for max_fail_percentage 12081 1726882431.18577: done checking for max_fail_percentage 12081 1726882431.18578: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.18579: done checking to see if all hosts have failed 12081 1726882431.18580: getting the remaining hosts for this loop 12081 1726882431.18582: done getting the remaining hosts for this loop 12081 1726882431.18587: getting the next task for host managed_node3 12081 1726882431.18595: done getting next task for host managed_node3 12081 1726882431.18600: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882431.18607: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.18631: getting variables 12081 1726882431.18633: in VariableManager get_vars() 12081 1726882431.18687: Calling all_inventory to load vars for managed_node3 12081 1726882431.18690: Calling groups_inventory to load vars for managed_node3 12081 1726882431.18693: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.18707: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.18711: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.18715: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.19757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.21460: done with get_vars() 12081 1726882431.21501: done getting variables 12081 1726882431.21572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:51 -0400 (0:00:00.049) 0:00:51.019 ****** 12081 1726882431.21611: entering _queue_task() for managed_node3/fail 12081 1726882431.21947: worker is 1 (out of 1 available) 12081 1726882431.21965: exiting _queue_task() for managed_node3/fail 12081 1726882431.21978: done queuing things up, now waiting for results queue to drain 12081 1726882431.21980: waiting for pending results... 12081 1726882431.22281: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882431.22439: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a31 12081 1726882431.22461: variable 'ansible_search_path' from source: unknown 12081 1726882431.22470: variable 'ansible_search_path' from source: unknown 12081 1726882431.22511: calling self._execute() 12081 1726882431.22616: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.22627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.22643: variable 'omit' from source: magic vars 12081 1726882431.23026: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.23044: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.23178: variable 'network_state' from source: role '' defaults 12081 1726882431.23196: Evaluated conditional (network_state != {}): False 12081 1726882431.23203: when evaluation is False, skipping this task 12081 1726882431.23209: _execute() done 12081 1726882431.23215: dumping result to json 12081 1726882431.23222: done dumping result, returning 12081 1726882431.23231: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-0a3f-ff3c-000000000a31] 12081 1726882431.23243: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a31 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882431.23401: no more pending results, returning what we have 12081 1726882431.23406: results queue empty 12081 1726882431.23406: checking for any_errors_fatal 12081 1726882431.23417: done checking for any_errors_fatal 12081 1726882431.23418: checking for max_fail_percentage 12081 1726882431.23420: done checking for max_fail_percentage 12081 1726882431.23421: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.23422: done checking to see if all hosts have failed 12081 1726882431.23423: getting the remaining hosts for this loop 12081 1726882431.23425: done getting the remaining hosts for this loop 12081 1726882431.23429: getting the next task for host managed_node3 12081 1726882431.23438: done getting next task for host managed_node3 12081 1726882431.23442: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882431.23448: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.23479: getting variables 12081 1726882431.23481: in VariableManager get_vars() 12081 1726882431.23525: Calling all_inventory to load vars for managed_node3 12081 1726882431.23528: Calling groups_inventory to load vars for managed_node3 12081 1726882431.23530: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.23544: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.23547: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.23550: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.24483: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a31 12081 1726882431.24487: WORKER PROCESS EXITING 12081 1726882431.25513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.27214: done with get_vars() 12081 1726882431.27246: done getting variables 12081 1726882431.27309: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:51 -0400 (0:00:00.057) 0:00:51.076 ****** 12081 1726882431.27348: entering _queue_task() for managed_node3/fail 12081 1726882431.27687: worker is 1 (out of 1 available) 12081 1726882431.27699: exiting _queue_task() for managed_node3/fail 12081 1726882431.27712: done queuing things up, now waiting for results queue to drain 12081 1726882431.27713: waiting for pending results... 12081 1726882431.28016: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882431.28195: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a32 12081 1726882431.28214: variable 'ansible_search_path' from source: unknown 12081 1726882431.28221: variable 'ansible_search_path' from source: unknown 12081 1726882431.28275: calling self._execute() 12081 1726882431.28380: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.28391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.28402: variable 'omit' from source: magic vars 12081 1726882431.28784: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.28806: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.28998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882431.31483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.31567: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.31610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 
1726882431.31659: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.31694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.31790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.31841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.31882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.31930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.31952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.32067: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.32094: Evaluated conditional (ansible_distribution_major_version | int > 9): False 12081 1726882431.32102: when evaluation is False, skipping this task 12081 1726882431.32110: _execute() done 12081 1726882431.32118: dumping result to json 12081 1726882431.32126: done dumping result, returning 12081 1726882431.32137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-0a3f-ff3c-000000000a32] 12081 1726882431.32147: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a32 12081 1726882431.32270: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a32 12081 1726882431.32277: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 12081 1726882431.32332: no more pending results, returning what we have 12081 1726882431.32336: results queue empty 12081 1726882431.32337: checking for any_errors_fatal 12081 1726882431.32345: done checking for any_errors_fatal 12081 1726882431.32346: checking for max_fail_percentage 12081 1726882431.32348: done checking for max_fail_percentage 12081 1726882431.32349: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.32350: done checking to see if all hosts have failed 12081 1726882431.32351: getting the remaining hosts for this loop 12081 1726882431.32356: done getting the remaining hosts for this loop 12081 1726882431.32366: getting the next task for host managed_node3 12081 1726882431.32375: done getting next task for host managed_node3 12081 1726882431.32380: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882431.32385: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.32406: getting variables 12081 1726882431.32408: in VariableManager get_vars() 12081 1726882431.32455: Calling all_inventory to load vars for managed_node3 12081 1726882431.32458: Calling groups_inventory to load vars for managed_node3 12081 1726882431.32461: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.32478: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.32482: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.32485: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.33680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.34634: done with get_vars() 12081 1726882431.34655: done getting variables 12081 1726882431.34706: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:51 -0400 (0:00:00.073) 0:00:51.150 ****** 12081 1726882431.34741: entering _queue_task() for managed_node3/dnf 12081 1726882431.35055: worker is 1 (out of 1 available) 12081 1726882431.35073: exiting _queue_task() for managed_node3/dnf 12081 1726882431.35087: done queuing things up, now waiting for results queue to drain 12081 1726882431.35088: waiting for pending results... 12081 1726882431.35402: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882431.35562: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a33 12081 1726882431.35583: variable 'ansible_search_path' from source: unknown 12081 1726882431.35591: variable 'ansible_search_path' from source: unknown 12081 1726882431.35640: calling self._execute() 12081 1726882431.35822: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.35875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.35883: variable 'omit' from source: magic vars 12081 1726882431.36192: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.36200: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.36338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882431.38327: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.38391: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.38425: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 
1726882431.38459: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.38486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.38565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.38593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.38620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.38668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.38679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.38797: variable 'ansible_distribution' from source: facts 12081 1726882431.38800: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.38816: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12081 1726882431.38929: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882431.39057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12081 1726882431.39081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.39105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.39144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.39158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.39210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.39228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.39255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.39294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.39311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.39349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.39377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.39395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.39433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.39446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.39602: variable 'network_connections' from source: task vars 12081 1726882431.39614: variable 'controller_profile' from source: play vars 12081 1726882431.39680: variable 'controller_profile' from source: play vars 12081 1726882431.39689: variable 'controller_device' from source: play vars 12081 1726882431.39750: variable 'controller_device' from source: play vars 12081 1726882431.39757: variable 'dhcp_interface1' from source: play vars 12081 1726882431.39820: variable 'dhcp_interface1' from source: play vars 12081 1726882431.39834: variable 'port1_profile' from source: play vars 12081 1726882431.39894: variable 'port1_profile' from source: play vars 12081 1726882431.39907: variable 'dhcp_interface1' from source: play vars 12081 
1726882431.39970: variable 'dhcp_interface1' from source: play vars 12081 1726882431.39982: variable 'controller_profile' from source: play vars 12081 1726882431.40068: variable 'controller_profile' from source: play vars 12081 1726882431.40080: variable 'port2_profile' from source: play vars 12081 1726882431.40140: variable 'port2_profile' from source: play vars 12081 1726882431.40153: variable 'dhcp_interface2' from source: play vars 12081 1726882431.40215: variable 'dhcp_interface2' from source: play vars 12081 1726882431.40235: variable 'controller_profile' from source: play vars 12081 1726882431.40302: variable 'controller_profile' from source: play vars 12081 1726882431.40363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882431.40502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882431.40533: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882431.40558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882431.40585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882431.40625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882431.40642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882431.40670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.40688: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882431.40734: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882431.40890: variable 'network_connections' from source: task vars 12081 1726882431.40893: variable 'controller_profile' from source: play vars 12081 1726882431.40936: variable 'controller_profile' from source: play vars 12081 1726882431.40942: variable 'controller_device' from source: play vars 12081 1726882431.40987: variable 'controller_device' from source: play vars 12081 1726882431.40994: variable 'dhcp_interface1' from source: play vars 12081 1726882431.41036: variable 'dhcp_interface1' from source: play vars 12081 1726882431.41042: variable 'port1_profile' from source: play vars 12081 1726882431.41093: variable 'port1_profile' from source: play vars 12081 1726882431.41096: variable 'dhcp_interface1' from source: play vars 12081 1726882431.41134: variable 'dhcp_interface1' from source: play vars 12081 1726882431.41139: variable 'controller_profile' from source: play vars 12081 1726882431.41183: variable 'controller_profile' from source: play vars 12081 1726882431.41189: variable 'port2_profile' from source: play vars 12081 1726882431.41232: variable 'port2_profile' from source: play vars 12081 1726882431.41237: variable 'dhcp_interface2' from source: play vars 12081 1726882431.41280: variable 'dhcp_interface2' from source: play vars 12081 1726882431.41286: variable 'controller_profile' from source: play vars 12081 1726882431.41328: variable 'controller_profile' from source: play vars 12081 1726882431.41356: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882431.41359: when evaluation is False, skipping this task 12081 1726882431.41362: _execute() done 12081 1726882431.41367: dumping result to json 12081 
1726882431.41369: done dumping result, returning 12081 1726882431.41373: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000a33] 12081 1726882431.41380: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a33 12081 1726882431.41475: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a33 12081 1726882431.41478: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882431.41530: no more pending results, returning what we have 12081 1726882431.41534: results queue empty 12081 1726882431.41534: checking for any_errors_fatal 12081 1726882431.41541: done checking for any_errors_fatal 12081 1726882431.41542: checking for max_fail_percentage 12081 1726882431.41543: done checking for max_fail_percentage 12081 1726882431.41544: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.41545: done checking to see if all hosts have failed 12081 1726882431.41546: getting the remaining hosts for this loop 12081 1726882431.41548: done getting the remaining hosts for this loop 12081 1726882431.41552: getting the next task for host managed_node3 12081 1726882431.41562: done getting next task for host managed_node3 12081 1726882431.41569: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882431.41573: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.41594: getting variables 12081 1726882431.41596: in VariableManager get_vars() 12081 1726882431.41636: Calling all_inventory to load vars for managed_node3 12081 1726882431.41638: Calling groups_inventory to load vars for managed_node3 12081 1726882431.41640: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.41651: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.41655: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.41658: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.42873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.44495: done with get_vars() 12081 1726882431.44528: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882431.44618: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:51 -0400 (0:00:00.099) 0:00:51.249 ****** 12081 1726882431.44690: entering _queue_task() for managed_node3/yum 12081 1726882431.45098: worker is 1 (out of 1 available) 12081 1726882431.45113: exiting _queue_task() for managed_node3/yum 12081 1726882431.45127: done queuing things up, now waiting for results queue to drain 12081 1726882431.45128: waiting for pending results... 12081 1726882431.45337: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882431.45443: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a34 12081 1726882431.45455: variable 'ansible_search_path' from source: unknown 12081 1726882431.45467: variable 'ansible_search_path' from source: unknown 12081 1726882431.45500: calling self._execute() 12081 1726882431.45580: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.45583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.45590: variable 'omit' from source: magic vars 12081 1726882431.45865: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.45878: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.46003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 12081 1726882431.48207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.48250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.48284: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882431.48309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.48329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.48394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.48422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.48440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.48471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.48482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.48554: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.48571: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 12081 1726882431.48575: when evaluation is False, skipping this task 12081 1726882431.48578: _execute() done 12081 1726882431.48580: dumping result to json 12081 1726882431.48583: done dumping result, returning 12081 1726882431.48590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000a34] 12081 1726882431.48597: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a34 12081 1726882431.48689: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a34 12081 1726882431.48692: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12081 1726882431.48758: no more pending results, returning what we have 12081 1726882431.48762: results queue empty 12081 1726882431.48765: checking for any_errors_fatal 12081 1726882431.48773: done checking for any_errors_fatal 12081 1726882431.48773: checking for max_fail_percentage 12081 1726882431.48776: done checking for max_fail_percentage 12081 1726882431.48777: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.48778: done checking to see if all hosts have failed 12081 1726882431.48778: getting the remaining hosts for this loop 12081 1726882431.48780: done getting the remaining hosts for this loop 12081 1726882431.48784: getting the next task for host managed_node3 12081 1726882431.48792: done getting next task for host managed_node3 12081 1726882431.48796: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882431.48801: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882431.48822: getting variables 12081 1726882431.48824: in VariableManager get_vars() 12081 1726882431.48860: Calling all_inventory to load vars for managed_node3 12081 1726882431.48869: Calling groups_inventory to load vars for managed_node3 12081 1726882431.48873: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.48882: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.48884: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.48887: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.49713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.50768: done with get_vars() 12081 1726882431.50785: done getting variables 12081 1726882431.50830: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:51 -0400 (0:00:00.061) 0:00:51.311 ****** 12081 1726882431.50858: entering _queue_task() for managed_node3/fail 12081 1726882431.51097: worker is 1 (out of 1 available) 12081 1726882431.51110: exiting _queue_task() for managed_node3/fail 12081 1726882431.51123: done queuing things up, now waiting for results queue to drain 12081 1726882431.51125: waiting for pending results... 
12081 1726882431.51313: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882431.51413: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a35 12081 1726882431.51423: variable 'ansible_search_path' from source: unknown 12081 1726882431.51428: variable 'ansible_search_path' from source: unknown 12081 1726882431.51459: calling self._execute() 12081 1726882431.51534: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.51539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.51547: variable 'omit' from source: magic vars 12081 1726882431.51821: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.51832: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.51917: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882431.52049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882431.53634: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.53685: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.53711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882431.53735: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.53758: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.53818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12081 1726882431.53848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.53871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.53900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.53910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.53940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.53958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.53983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.54008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.54018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.54045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.54065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.54084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.54111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.54121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.54233: variable 'network_connections' from source: task vars 12081 1726882431.54242: variable 'controller_profile' from source: play vars 12081 1726882431.54293: variable 'controller_profile' from source: play vars 12081 1726882431.54301: variable 'controller_device' from source: play vars 12081 1726882431.54348: variable 'controller_device' from source: play vars 12081 1726882431.54359: variable 'dhcp_interface1' from source: play vars 12081 1726882431.54404: variable 'dhcp_interface1' from source: play vars 12081 1726882431.54410: variable 'port1_profile' from source: play vars 12081 1726882431.54458: variable 'port1_profile' from source: play vars 12081 1726882431.54461: variable 'dhcp_interface1' from source: play vars 12081 
1726882431.54508: variable 'dhcp_interface1' from source: play vars 12081 1726882431.54514: variable 'controller_profile' from source: play vars 12081 1726882431.54567: variable 'controller_profile' from source: play vars 12081 1726882431.54572: variable 'port2_profile' from source: play vars 12081 1726882431.54614: variable 'port2_profile' from source: play vars 12081 1726882431.54619: variable 'dhcp_interface2' from source: play vars 12081 1726882431.54668: variable 'dhcp_interface2' from source: play vars 12081 1726882431.54675: variable 'controller_profile' from source: play vars 12081 1726882431.54716: variable 'controller_profile' from source: play vars 12081 1726882431.54769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882431.54881: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882431.54908: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882431.54930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882431.54956: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882431.54987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882431.55002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882431.55018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.55040: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882431.55093: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882431.55256: variable 'network_connections' from source: task vars 12081 1726882431.55262: variable 'controller_profile' from source: play vars 12081 1726882431.55309: variable 'controller_profile' from source: play vars 12081 1726882431.55315: variable 'controller_device' from source: play vars 12081 1726882431.55358: variable 'controller_device' from source: play vars 12081 1726882431.55367: variable 'dhcp_interface1' from source: play vars 12081 1726882431.55411: variable 'dhcp_interface1' from source: play vars 12081 1726882431.55417: variable 'port1_profile' from source: play vars 12081 1726882431.55460: variable 'port1_profile' from source: play vars 12081 1726882431.55468: variable 'dhcp_interface1' from source: play vars 12081 1726882431.55511: variable 'dhcp_interface1' from source: play vars 12081 1726882431.55517: variable 'controller_profile' from source: play vars 12081 1726882431.55560: variable 'controller_profile' from source: play vars 12081 1726882431.55567: variable 'port2_profile' from source: play vars 12081 1726882431.55614: variable 'port2_profile' from source: play vars 12081 1726882431.55622: variable 'dhcp_interface2' from source: play vars 12081 1726882431.55667: variable 'dhcp_interface2' from source: play vars 12081 1726882431.55672: variable 'controller_profile' from source: play vars 12081 1726882431.55716: variable 'controller_profile' from source: play vars 12081 1726882431.55741: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882431.55744: when evaluation is False, skipping this task 12081 1726882431.55747: _execute() done 12081 1726882431.55749: dumping result to json 12081 
1726882431.55751: done dumping result, returning 12081 1726882431.55759: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000a35] 12081 1726882431.55767: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a35 12081 1726882431.55861: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a35 12081 1726882431.55867: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882431.55912: no more pending results, returning what we have 12081 1726882431.55916: results queue empty 12081 1726882431.55917: checking for any_errors_fatal 12081 1726882431.55925: done checking for any_errors_fatal 12081 1726882431.55925: checking for max_fail_percentage 12081 1726882431.55931: done checking for max_fail_percentage 12081 1726882431.55932: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.55933: done checking to see if all hosts have failed 12081 1726882431.55934: getting the remaining hosts for this loop 12081 1726882431.55936: done getting the remaining hosts for this loop 12081 1726882431.55940: getting the next task for host managed_node3 12081 1726882431.55948: done getting next task for host managed_node3 12081 1726882431.55952: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12081 1726882431.55960: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.55981: getting variables 12081 1726882431.55983: in VariableManager get_vars() 12081 1726882431.56022: Calling all_inventory to load vars for managed_node3 12081 1726882431.56024: Calling groups_inventory to load vars for managed_node3 12081 1726882431.56026: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.56040: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.56043: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.56046: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.56891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.57844: done with get_vars() 12081 1726882431.57869: done getting variables 12081 1726882431.57913: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:51 -0400 (0:00:00.070) 0:00:51.382 ****** 12081 1726882431.57941: entering _queue_task() for managed_node3/package 12081 1726882431.58186: worker is 1 (out of 1 available) 12081 1726882431.58199: exiting _queue_task() for managed_node3/package 12081 1726882431.58213: done queuing things up, now waiting for results queue to drain 12081 1726882431.58215: waiting for pending results... 12081 1726882431.58407: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12081 1726882431.58522: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a36 12081 1726882431.58536: variable 'ansible_search_path' from source: unknown 12081 1726882431.58539: variable 'ansible_search_path' from source: unknown 12081 1726882431.58570: calling self._execute() 12081 1726882431.58644: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.58650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.58659: variable 'omit' from source: magic vars 12081 1726882431.58942: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.58951: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.59095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882431.59294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882431.59327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882431.59352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882431.59417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
12081 1726882431.59498: variable 'network_packages' from source: role '' defaults 12081 1726882431.59580: variable '__network_provider_setup' from source: role '' defaults 12081 1726882431.59588: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882431.59635: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882431.59642: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882431.59688: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882431.59806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882431.61242: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.61298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.61324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882431.61346: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.61378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.61436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.61457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.61484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 12081 1726882431.61511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.61521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.61551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.61572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.61591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.61616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.61626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.61777: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882431.61851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.61872: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.61889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.61918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.61929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.61994: variable 'ansible_python' from source: facts 12081 1726882431.62009: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882431.62068: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882431.62126: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882431.62209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.62228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.62247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.62277: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.62287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.62319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.62343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.62362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.62388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.62398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.62498: variable 'network_connections' from source: task vars 12081 1726882431.62502: variable 'controller_profile' from source: play vars 12081 1726882431.62579: variable 'controller_profile' from source: play vars 12081 1726882431.62589: variable 'controller_device' from source: play vars 12081 1726882431.62661: variable 'controller_device' from source: play vars 12081 1726882431.62676: variable 'dhcp_interface1' from source: play vars 12081 
1726882431.63002: variable 'dhcp_interface1' from source: play vars 12081 1726882431.63011: variable 'port1_profile' from source: play vars 12081 1726882431.63087: variable 'port1_profile' from source: play vars 12081 1726882431.63100: variable 'dhcp_interface1' from source: play vars 12081 1726882431.63171: variable 'dhcp_interface1' from source: play vars 12081 1726882431.63179: variable 'controller_profile' from source: play vars 12081 1726882431.63271: variable 'controller_profile' from source: play vars 12081 1726882431.63279: variable 'port2_profile' from source: play vars 12081 1726882431.63350: variable 'port2_profile' from source: play vars 12081 1726882431.63360: variable 'dhcp_interface2' from source: play vars 12081 1726882431.63433: variable 'dhcp_interface2' from source: play vars 12081 1726882431.63440: variable 'controller_profile' from source: play vars 12081 1726882431.63515: variable 'controller_profile' from source: play vars 12081 1726882431.63574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882431.63595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882431.63616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.63643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882431.63681: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882431.63868: variable 'network_connections' from source: task 
vars 12081 1726882431.63873: variable 'controller_profile' from source: play vars 12081 1726882431.63944: variable 'controller_profile' from source: play vars 12081 1726882431.63951: variable 'controller_device' from source: play vars 12081 1726882431.64026: variable 'controller_device' from source: play vars 12081 1726882431.64035: variable 'dhcp_interface1' from source: play vars 12081 1726882431.64090: variable 'dhcp_interface1' from source: play vars 12081 1726882431.64099: variable 'port1_profile' from source: play vars 12081 1726882431.64172: variable 'port1_profile' from source: play vars 12081 1726882431.64180: variable 'dhcp_interface1' from source: play vars 12081 1726882431.64251: variable 'dhcp_interface1' from source: play vars 12081 1726882431.64261: variable 'controller_profile' from source: play vars 12081 1726882431.64333: variable 'controller_profile' from source: play vars 12081 1726882431.64342: variable 'port2_profile' from source: play vars 12081 1726882431.64968: variable 'port2_profile' from source: play vars 12081 1726882431.64971: variable 'dhcp_interface2' from source: play vars 12081 1726882431.64974: variable 'dhcp_interface2' from source: play vars 12081 1726882431.64976: variable 'controller_profile' from source: play vars 12081 1726882431.64977: variable 'controller_profile' from source: play vars 12081 1726882431.64979: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882431.64981: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882431.65109: variable 'network_connections' from source: task vars 12081 1726882431.65118: variable 'controller_profile' from source: play vars 12081 1726882431.65213: variable 'controller_profile' from source: play vars 12081 1726882431.65225: variable 'controller_device' from source: play vars 12081 1726882431.65293: variable 'controller_device' from source: play vars 12081 1726882431.65307: variable 'dhcp_interface1' from 
source: play vars 12081 1726882431.65380: variable 'dhcp_interface1' from source: play vars 12081 1726882431.65394: variable 'port1_profile' from source: play vars 12081 1726882431.65459: variable 'port1_profile' from source: play vars 12081 1726882431.65480: variable 'dhcp_interface1' from source: play vars 12081 1726882431.65544: variable 'dhcp_interface1' from source: play vars 12081 1726882431.65559: variable 'controller_profile' from source: play vars 12081 1726882431.65626: variable 'controller_profile' from source: play vars 12081 1726882431.65636: variable 'port2_profile' from source: play vars 12081 1726882431.65704: variable 'port2_profile' from source: play vars 12081 1726882431.65716: variable 'dhcp_interface2' from source: play vars 12081 1726882431.65785: variable 'dhcp_interface2' from source: play vars 12081 1726882431.65790: variable 'controller_profile' from source: play vars 12081 1726882431.65834: variable 'controller_profile' from source: play vars 12081 1726882431.65880: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882431.65932: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882431.66148: variable 'network_connections' from source: task vars 12081 1726882431.66151: variable 'controller_profile' from source: play vars 12081 1726882431.66202: variable 'controller_profile' from source: play vars 12081 1726882431.66208: variable 'controller_device' from source: play vars 12081 1726882431.66252: variable 'controller_device' from source: play vars 12081 1726882431.66261: variable 'dhcp_interface1' from source: play vars 12081 1726882431.66313: variable 'dhcp_interface1' from source: play vars 12081 1726882431.66320: variable 'port1_profile' from source: play vars 12081 1726882431.66368: variable 'port1_profile' from source: play vars 12081 1726882431.66373: variable 'dhcp_interface1' from source: play vars 12081 1726882431.66420: variable 'dhcp_interface1' from source: play 
vars 12081 1726882431.66425: variable 'controller_profile' from source: play vars 12081 1726882431.66473: variable 'controller_profile' from source: play vars 12081 1726882431.66479: variable 'port2_profile' from source: play vars 12081 1726882431.66527: variable 'port2_profile' from source: play vars 12081 1726882431.66532: variable 'dhcp_interface2' from source: play vars 12081 1726882431.66580: variable 'dhcp_interface2' from source: play vars 12081 1726882431.66585: variable 'controller_profile' from source: play vars 12081 1726882431.66633: variable 'controller_profile' from source: play vars 12081 1726882431.66683: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882431.66727: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882431.66735: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882431.66778: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882431.66915: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882431.67227: variable 'network_connections' from source: task vars 12081 1726882431.67230: variable 'controller_profile' from source: play vars 12081 1726882431.67278: variable 'controller_profile' from source: play vars 12081 1726882431.67284: variable 'controller_device' from source: play vars 12081 1726882431.67324: variable 'controller_device' from source: play vars 12081 1726882431.67331: variable 'dhcp_interface1' from source: play vars 12081 1726882431.67379: variable 'dhcp_interface1' from source: play vars 12081 1726882431.67382: variable 'port1_profile' from source: play vars 12081 1726882431.67423: variable 'port1_profile' from source: play vars 12081 1726882431.67429: variable 'dhcp_interface1' from source: play vars 12081 1726882431.67477: variable 'dhcp_interface1' from source: play vars 12081 1726882431.67480: variable 
'controller_profile' from source: play vars 12081 1726882431.67521: variable 'controller_profile' from source: play vars 12081 1726882431.67527: variable 'port2_profile' from source: play vars 12081 1726882431.67571: variable 'port2_profile' from source: play vars 12081 1726882431.67582: variable 'dhcp_interface2' from source: play vars 12081 1726882431.67620: variable 'dhcp_interface2' from source: play vars 12081 1726882431.67626: variable 'controller_profile' from source: play vars 12081 1726882431.67672: variable 'controller_profile' from source: play vars 12081 1726882431.67678: variable 'ansible_distribution' from source: facts 12081 1726882431.67681: variable '__network_rh_distros' from source: role '' defaults 12081 1726882431.67687: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.67729: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882431.67838: variable 'ansible_distribution' from source: facts 12081 1726882431.67842: variable '__network_rh_distros' from source: role '' defaults 12081 1726882431.67847: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.67858: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882431.67968: variable 'ansible_distribution' from source: facts 12081 1726882431.67971: variable '__network_rh_distros' from source: role '' defaults 12081 1726882431.67976: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.68001: variable 'network_provider' from source: set_fact 12081 1726882431.68013: variable 'ansible_facts' from source: unknown 12081 1726882431.68594: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12081 1726882431.68602: when evaluation is False, skipping this task 12081 1726882431.68608: _execute() done 12081 1726882431.68615: dumping result to json 12081 1726882431.68621: done 
dumping result, returning 12081 1726882431.68632: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-0a3f-ff3c-000000000a36] 12081 1726882431.68642: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a36 12081 1726882431.68765: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a36 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12081 1726882431.68817: no more pending results, returning what we have 12081 1726882431.68820: results queue empty 12081 1726882431.68821: checking for any_errors_fatal 12081 1726882431.68827: done checking for any_errors_fatal 12081 1726882431.68828: checking for max_fail_percentage 12081 1726882431.68830: done checking for max_fail_percentage 12081 1726882431.68831: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.68832: done checking to see if all hosts have failed 12081 1726882431.68833: getting the remaining hosts for this loop 12081 1726882431.68834: done getting the remaining hosts for this loop 12081 1726882431.68838: getting the next task for host managed_node3 12081 1726882431.68845: done getting next task for host managed_node3 12081 1726882431.68849: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882431.68854: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.68875: getting variables 12081 1726882431.68877: in VariableManager get_vars() 12081 1726882431.68918: Calling all_inventory to load vars for managed_node3 12081 1726882431.68921: Calling groups_inventory to load vars for managed_node3 12081 1726882431.68923: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.68933: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.68936: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.68938: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.69933: WORKER PROCESS EXITING 12081 1726882431.70903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.72695: done with get_vars() 12081 1726882431.72727: done getting variables 12081 1726882431.72796: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:51 -0400 (0:00:00.148) 0:00:51.531 ****** 12081 1726882431.72834: entering _queue_task() for managed_node3/package 12081 1726882431.73196: worker is 1 (out of 1 available) 12081 1726882431.73208: exiting _queue_task() for managed_node3/package 12081 1726882431.73221: done queuing things up, now waiting for results queue to drain 12081 1726882431.73222: waiting for pending results... 12081 1726882431.73530: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882431.73718: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a37 12081 1726882431.73739: variable 'ansible_search_path' from source: unknown 12081 1726882431.73748: variable 'ansible_search_path' from source: unknown 12081 1726882431.73801: calling self._execute() 12081 1726882431.73910: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.73922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.73935: variable 'omit' from source: magic vars 12081 1726882431.74340: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.74366: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.74502: variable 'network_state' from source: role '' defaults 12081 1726882431.74515: Evaluated conditional (network_state != {}): False 12081 1726882431.74520: when evaluation is False, skipping this task 12081 1726882431.74526: _execute() done 12081 1726882431.74531: dumping result to json 12081 1726882431.74542: done dumping result, returning 12081 1726882431.74555: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000a37] 12081 1726882431.74569: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a37 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882431.74727: no more pending results, returning what we have 12081 1726882431.74732: results queue empty 12081 1726882431.74733: checking for any_errors_fatal 12081 1726882431.74743: done checking for any_errors_fatal 12081 1726882431.74744: checking for max_fail_percentage 12081 1726882431.74746: done checking for max_fail_percentage 12081 1726882431.74747: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.74748: done checking to see if all hosts have failed 12081 1726882431.74749: getting the remaining hosts for this loop 12081 1726882431.74751: done getting the remaining hosts for this loop 12081 1726882431.74758: getting the next task for host managed_node3 12081 1726882431.74770: done getting next task for host managed_node3 12081 1726882431.74776: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882431.74781: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.74804: getting variables 12081 1726882431.74806: in VariableManager get_vars() 12081 1726882431.74847: Calling all_inventory to load vars for managed_node3 12081 1726882431.74850: Calling groups_inventory to load vars for managed_node3 12081 1726882431.74852: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.74868: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.74871: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.74873: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.75881: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a37 12081 1726882431.75884: WORKER PROCESS EXITING 12081 1726882431.76319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.78778: done with get_vars() 12081 1726882431.78805: done getting variables 12081 1726882431.78868: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:51 -0400 (0:00:00.060) 0:00:51.592 ****** 12081 1726882431.78906: entering _queue_task() for managed_node3/package 12081 1726882431.79276: worker is 1 (out of 1 available) 12081 1726882431.79287: exiting _queue_task() for managed_node3/package 12081 1726882431.79299: done queuing things up, now waiting for results queue to drain 12081 1726882431.79300: waiting for pending results... 12081 1726882431.80629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882431.81136: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a38 12081 1726882431.81414: variable 'ansible_search_path' from source: unknown 12081 1726882431.81422: variable 'ansible_search_path' from source: unknown 12081 1726882431.81468: calling self._execute() 12081 1726882431.81793: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.81804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.81817: variable 'omit' from source: magic vars 12081 1726882431.82926: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.82943: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.83185: variable 'network_state' from source: role '' defaults 12081 1726882431.83262: Evaluated conditional (network_state != {}): False 12081 1726882431.83273: when evaluation is False, skipping this task 12081 1726882431.83279: _execute() done 12081 1726882431.83285: dumping result to json 12081 1726882431.83291: done dumping result, returning 12081 1726882431.83303: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000a38] 12081 
1726882431.83315: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a38 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882431.83482: no more pending results, returning what we have 12081 1726882431.83486: results queue empty 12081 1726882431.83487: checking for any_errors_fatal 12081 1726882431.83495: done checking for any_errors_fatal 12081 1726882431.83496: checking for max_fail_percentage 12081 1726882431.83498: done checking for max_fail_percentage 12081 1726882431.83499: checking to see if all hosts have failed and the running result is not ok 12081 1726882431.83500: done checking to see if all hosts have failed 12081 1726882431.83501: getting the remaining hosts for this loop 12081 1726882431.83503: done getting the remaining hosts for this loop 12081 1726882431.83507: getting the next task for host managed_node3 12081 1726882431.83517: done getting next task for host managed_node3 12081 1726882431.83521: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882431.83528: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882431.83552: getting variables 12081 1726882431.83557: in VariableManager get_vars() 12081 1726882431.83603: Calling all_inventory to load vars for managed_node3 12081 1726882431.83606: Calling groups_inventory to load vars for managed_node3 12081 1726882431.83608: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882431.83621: Calling all_plugins_play to load vars for managed_node3 12081 1726882431.83624: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882431.83627: Calling groups_plugins_play to load vars for managed_node3 12081 1726882431.85085: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a38 12081 1726882431.85089: WORKER PROCESS EXITING 12081 1726882431.86420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882431.88151: done with get_vars() 12081 1726882431.88180: done getting variables 12081 1726882431.89189: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:51 -0400 (0:00:00.103) 0:00:51.695 ****** 12081 1726882431.89228: entering _queue_task() for managed_node3/service 12081 
1726882431.89552: worker is 1 (out of 1 available) 12081 1726882431.89567: exiting _queue_task() for managed_node3/service 12081 1726882431.89583: done queuing things up, now waiting for results queue to drain 12081 1726882431.89584: waiting for pending results... 12081 1726882431.90186: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882431.90605: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a39 12081 1726882431.90622: variable 'ansible_search_path' from source: unknown 12081 1726882431.90629: variable 'ansible_search_path' from source: unknown 12081 1726882431.90673: calling self._execute() 12081 1726882431.90799: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882431.90928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882431.90941: variable 'omit' from source: magic vars 12081 1726882431.91698: variable 'ansible_distribution_major_version' from source: facts 12081 1726882431.91715: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882431.91996: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882431.92318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882431.97079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882431.97157: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882431.97200: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882431.97244: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882431.97290: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882431.97377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.97424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.97474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.97525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.97549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.97600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.97639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.97680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.97730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.97749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.97800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882431.97832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882431.97864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882431.97912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882431.97933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882431.98135: variable 'network_connections' from source: task vars 12081 1726882431.98157: variable 'controller_profile' from source: play vars 12081 1726882431.98236: variable 'controller_profile' from source: play vars 12081 1726882431.98255: variable 'controller_device' from source: play vars 12081 1726882431.98329: variable 'controller_device' from source: play vars 12081 1726882431.98345: variable 'dhcp_interface1' from source: play vars 12081 1726882431.98417: variable 'dhcp_interface1' from source: play vars 
12081 1726882431.98430: variable 'port1_profile' from source: play vars 12081 1726882431.98500: variable 'port1_profile' from source: play vars 12081 1726882431.98512: variable 'dhcp_interface1' from source: play vars 12081 1726882431.98581: variable 'dhcp_interface1' from source: play vars 12081 1726882431.98597: variable 'controller_profile' from source: play vars 12081 1726882431.98667: variable 'controller_profile' from source: play vars 12081 1726882431.98679: variable 'port2_profile' from source: play vars 12081 1726882431.98749: variable 'port2_profile' from source: play vars 12081 1726882431.98766: variable 'dhcp_interface2' from source: play vars 12081 1726882431.98832: variable 'dhcp_interface2' from source: play vars 12081 1726882431.98855: variable 'controller_profile' from source: play vars 12081 1726882431.98923: variable 'controller_profile' from source: play vars 12081 1726882431.99006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882431.99921: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882432.00016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882432.00057: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882432.00157: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882432.00216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882432.00359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882432.00393: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.00434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882432.00519: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882432.00813: variable 'network_connections' from source: task vars 12081 1726882432.00824: variable 'controller_profile' from source: play vars 12081 1726882432.00897: variable 'controller_profile' from source: play vars 12081 1726882432.00910: variable 'controller_device' from source: play vars 12081 1726882432.00976: variable 'controller_device' from source: play vars 12081 1726882432.00991: variable 'dhcp_interface1' from source: play vars 12081 1726882432.01061: variable 'dhcp_interface1' from source: play vars 12081 1726882432.01078: variable 'port1_profile' from source: play vars 12081 1726882432.01145: variable 'port1_profile' from source: play vars 12081 1726882432.01160: variable 'dhcp_interface1' from source: play vars 12081 1726882432.01252: variable 'dhcp_interface1' from source: play vars 12081 1726882432.01268: variable 'controller_profile' from source: play vars 12081 1726882432.01335: variable 'controller_profile' from source: play vars 12081 1726882432.01346: variable 'port2_profile' from source: play vars 12081 1726882432.01410: variable 'port2_profile' from source: play vars 12081 1726882432.01441: variable 'dhcp_interface2' from source: play vars 12081 1726882432.01507: variable 'dhcp_interface2' from source: play vars 12081 1726882432.01519: variable 'controller_profile' from source: play vars 12081 1726882432.01593: variable 'controller_profile' from source: play vars 12081 1726882432.01631: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882432.01639: when evaluation is False, skipping this task 12081 1726882432.01650: _execute() done 12081 1726882432.01665: dumping result to json 12081 1726882432.01675: done dumping result, returning 12081 1726882432.01687: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000a39] 12081 1726882432.01697: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a39 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882432.01856: no more pending results, returning what we have 12081 1726882432.01861: results queue empty 12081 1726882432.01863: checking for any_errors_fatal 12081 1726882432.01872: done checking for any_errors_fatal 12081 1726882432.01873: checking for max_fail_percentage 12081 1726882432.01875: done checking for max_fail_percentage 12081 1726882432.01876: checking to see if all hosts have failed and the running result is not ok 12081 1726882432.01877: done checking to see if all hosts have failed 12081 1726882432.01878: getting the remaining hosts for this loop 12081 1726882432.01880: done getting the remaining hosts for this loop 12081 1726882432.01885: getting the next task for host managed_node3 12081 1726882432.01895: done getting next task for host managed_node3 12081 1726882432.01900: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882432.01906: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882432.01930: getting variables 12081 1726882432.01933: in VariableManager get_vars() 12081 1726882432.01982: Calling all_inventory to load vars for managed_node3 12081 1726882432.01985: Calling groups_inventory to load vars for managed_node3 12081 1726882432.01988: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882432.02000: Calling all_plugins_play to load vars for managed_node3 12081 1726882432.02003: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882432.02005: Calling groups_plugins_play to load vars for managed_node3 12081 1726882432.03004: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a39 12081 1726882432.03007: WORKER PROCESS EXITING 12081 1726882432.03897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882432.06903: done with get_vars() 12081 1726882432.06938: done getting variables 12081 1726882432.07003: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:52 -0400 (0:00:00.178) 0:00:51.873 ****** 12081 1726882432.07038: entering _queue_task() for managed_node3/service 12081 1726882432.07379: worker is 1 (out of 1 available) 12081 1726882432.07391: exiting _queue_task() for managed_node3/service 12081 1726882432.07406: done queuing things up, now waiting for results queue to drain 12081 1726882432.07407: waiting for pending results... 12081 1726882432.07737: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882432.08102: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3a 12081 1726882432.08123: variable 'ansible_search_path' from source: unknown 12081 1726882432.08131: variable 'ansible_search_path' from source: unknown 12081 1726882432.08185: calling self._execute() 12081 1726882432.08300: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882432.08312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882432.08326: variable 'omit' from source: magic vars 12081 1726882432.08715: variable 'ansible_distribution_major_version' from source: facts 12081 1726882432.08733: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882432.08882: variable 'network_provider' from source: set_fact 12081 1726882432.08892: variable 'network_state' from source: role '' defaults 12081 1726882432.08911: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12081 
1726882432.08923: variable 'omit' from source: magic vars 12081 1726882432.09011: variable 'omit' from source: magic vars 12081 1726882432.09045: variable 'network_service_name' from source: role '' defaults 12081 1726882432.09117: variable 'network_service_name' from source: role '' defaults 12081 1726882432.09239: variable '__network_provider_setup' from source: role '' defaults 12081 1726882432.09251: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882432.09322: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882432.09335: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882432.09409: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882432.09652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882432.22345: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882432.22418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882432.22473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882432.22515: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882432.22543: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882432.22621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882432.22655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12081 1726882432.22687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.22737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882432.22757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882432.22804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882432.22837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882432.22871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.22913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882432.22930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882432.23180: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882432.23307: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882432.23337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882432.23372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.23421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882432.23440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882432.23545: variable 'ansible_python' from source: facts 12081 1726882432.23593: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882432.23706: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882432.24457: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882432.24804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882432.24867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882432.24977: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.25023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882432.25075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882432.25212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882432.25249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882432.25298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.25423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882432.25443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882432.25847: variable 'network_connections' from source: task vars 12081 1726882432.25867: variable 'controller_profile' from source: play vars 12081 1726882432.25952: variable 'controller_profile' from source: play vars 12081 
1726882432.26049: variable 'controller_device' from source: play vars 12081 1726882432.26130: variable 'controller_device' from source: play vars 12081 1726882432.26275: variable 'dhcp_interface1' from source: play vars 12081 1726882432.26347: variable 'dhcp_interface1' from source: play vars 12081 1726882432.26491: variable 'port1_profile' from source: play vars 12081 1726882432.26571: variable 'port1_profile' from source: play vars 12081 1726882432.26604: variable 'dhcp_interface1' from source: play vars 12081 1726882432.26799: variable 'dhcp_interface1' from source: play vars 12081 1726882432.26821: variable 'controller_profile' from source: play vars 12081 1726882432.26985: variable 'controller_profile' from source: play vars 12081 1726882432.27000: variable 'port2_profile' from source: play vars 12081 1726882432.27092: variable 'port2_profile' from source: play vars 12081 1726882432.27181: variable 'dhcp_interface2' from source: play vars 12081 1726882432.27310: variable 'dhcp_interface2' from source: play vars 12081 1726882432.27474: variable 'controller_profile' from source: play vars 12081 1726882432.27547: variable 'controller_profile' from source: play vars 12081 1726882432.27800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882432.28339: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882432.28405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882432.28577: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882432.28621: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882432.28804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 12081 1726882432.28837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882432.29000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882432.29037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882432.29090: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882432.29744: variable 'network_connections' from source: task vars 12081 1726882432.29765: variable 'controller_profile' from source: play vars 12081 1726882432.29938: variable 'controller_profile' from source: play vars 12081 1726882432.30084: variable 'controller_device' from source: play vars 12081 1726882432.30187: variable 'controller_device' from source: play vars 12081 1726882432.30208: variable 'dhcp_interface1' from source: play vars 12081 1726882432.30360: variable 'dhcp_interface1' from source: play vars 12081 1726882432.30416: variable 'port1_profile' from source: play vars 12081 1726882432.30584: variable 'port1_profile' from source: play vars 12081 1726882432.30732: variable 'dhcp_interface1' from source: play vars 12081 1726882432.30806: variable 'dhcp_interface1' from source: play vars 12081 1726882432.30825: variable 'controller_profile' from source: play vars 12081 1726882432.30904: variable 'controller_profile' from source: play vars 12081 1726882432.31061: variable 'port2_profile' from source: play vars 12081 1726882432.31132: variable 'port2_profile' from source: play vars 12081 1726882432.31281: variable 'dhcp_interface2' from source: play vars 12081 
1726882432.31359: variable 'dhcp_interface2' from source: play vars 12081 1726882432.31394: variable 'controller_profile' from source: play vars 12081 1726882432.31567: variable 'controller_profile' from source: play vars 12081 1726882432.31749: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882432.31843: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882432.32429: variable 'network_connections' from source: task vars 12081 1726882432.32582: variable 'controller_profile' from source: play vars 12081 1726882432.32652: variable 'controller_profile' from source: play vars 12081 1726882432.32670: variable 'controller_device' from source: play vars 12081 1726882432.32850: variable 'controller_device' from source: play vars 12081 1726882432.32869: variable 'dhcp_interface1' from source: play vars 12081 1726882432.33071: variable 'dhcp_interface1' from source: play vars 12081 1726882432.33086: variable 'port1_profile' from source: play vars 12081 1726882432.33182: variable 'port1_profile' from source: play vars 12081 1726882432.33345: variable 'dhcp_interface1' from source: play vars 12081 1726882432.33428: variable 'dhcp_interface1' from source: play vars 12081 1726882432.33557: variable 'controller_profile' from source: play vars 12081 1726882432.33633: variable 'controller_profile' from source: play vars 12081 1726882432.33646: variable 'port2_profile' from source: play vars 12081 1726882432.33840: variable 'port2_profile' from source: play vars 12081 1726882432.33858: variable 'dhcp_interface2' from source: play vars 12081 1726882432.34051: variable 'dhcp_interface2' from source: play vars 12081 1726882432.34070: variable 'controller_profile' from source: play vars 12081 1726882432.34167: variable 'controller_profile' from source: play vars 12081 1726882432.34349: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882432.34555: variable 
'__network_team_connections_defined' from source: role '' defaults 12081 1726882432.35101: variable 'network_connections' from source: task vars 12081 1726882432.35202: variable 'controller_profile' from source: play vars 12081 1726882432.35285: variable 'controller_profile' from source: play vars 12081 1726882432.35319: variable 'controller_device' from source: play vars 12081 1726882432.35550: variable 'controller_device' from source: play vars 12081 1726882432.35570: variable 'dhcp_interface1' from source: play vars 12081 1726882432.35677: variable 'dhcp_interface1' from source: play vars 12081 1726882432.35749: variable 'port1_profile' from source: play vars 12081 1726882432.35827: variable 'port1_profile' from source: play vars 12081 1726882432.35970: variable 'dhcp_interface1' from source: play vars 12081 1726882432.36044: variable 'dhcp_interface1' from source: play vars 12081 1726882432.36183: variable 'controller_profile' from source: play vars 12081 1726882432.36257: variable 'controller_profile' from source: play vars 12081 1726882432.36391: variable 'port2_profile' from source: play vars 12081 1726882432.36471: variable 'port2_profile' from source: play vars 12081 1726882432.36483: variable 'dhcp_interface2' from source: play vars 12081 1726882432.36576: variable 'dhcp_interface2' from source: play vars 12081 1726882432.36727: variable 'controller_profile' from source: play vars 12081 1726882432.36803: variable 'controller_profile' from source: play vars 12081 1726882432.36907: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882432.37114: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882432.37164: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882432.37227: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882432.37876: variable '__network_packages_default_initscripts_bridge' 
from source: role '' defaults 12081 1726882432.38956: variable 'network_connections' from source: task vars 12081 1726882432.39019: variable 'controller_profile' from source: play vars 12081 1726882432.39091: variable 'controller_profile' from source: play vars 12081 1726882432.39237: variable 'controller_device' from source: play vars 12081 1726882432.39305: variable 'controller_device' from source: play vars 12081 1726882432.39445: variable 'dhcp_interface1' from source: play vars 12081 1726882432.39517: variable 'dhcp_interface1' from source: play vars 12081 1726882432.39529: variable 'port1_profile' from source: play vars 12081 1726882432.39706: variable 'port1_profile' from source: play vars 12081 1726882432.39717: variable 'dhcp_interface1' from source: play vars 12081 1726882432.39795: variable 'dhcp_interface1' from source: play vars 12081 1726882432.39893: variable 'controller_profile' from source: play vars 12081 1726882432.39957: variable 'controller_profile' from source: play vars 12081 1726882432.40001: variable 'port2_profile' from source: play vars 12081 1726882432.40067: variable 'port2_profile' from source: play vars 12081 1726882432.40219: variable 'dhcp_interface2' from source: play vars 12081 1726882432.40285: variable 'dhcp_interface2' from source: play vars 12081 1726882432.40297: variable 'controller_profile' from source: play vars 12081 1726882432.40482: variable 'controller_profile' from source: play vars 12081 1726882432.40495: variable 'ansible_distribution' from source: facts 12081 1726882432.40503: variable '__network_rh_distros' from source: role '' defaults 12081 1726882432.40511: variable 'ansible_distribution_major_version' from source: facts 12081 1726882432.40652: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882432.40852: variable 'ansible_distribution' from source: facts 12081 1726882432.40985: variable '__network_rh_distros' from source: role '' defaults 12081 
1726882432.40996: variable 'ansible_distribution_major_version' from source: facts 12081 1726882432.41012: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882432.41362: variable 'ansible_distribution' from source: facts 12081 1726882432.41419: variable '__network_rh_distros' from source: role '' defaults 12081 1726882432.41430: variable 'ansible_distribution_major_version' from source: facts 12081 1726882432.41479: variable 'network_provider' from source: set_fact 12081 1726882432.41651: variable 'omit' from source: magic vars 12081 1726882432.41688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882432.41716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882432.41739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882432.41769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882432.41865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882432.41892: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882432.41900: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882432.41908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882432.42132: Set connection var ansible_pipelining to False 12081 1726882432.42140: Set connection var ansible_shell_type to sh 12081 1726882432.42186: Set connection var ansible_shell_executable to /bin/sh 12081 1726882432.42193: Set connection var ansible_connection to ssh 12081 1726882432.42203: Set connection var ansible_timeout to 10 12081 1726882432.42294: Set connection var ansible_module_compression to ZIP_DEFLATED 
12081 1726882432.42326: variable 'ansible_shell_executable' from source: unknown 12081 1726882432.42334: variable 'ansible_connection' from source: unknown 12081 1726882432.42342: variable 'ansible_module_compression' from source: unknown 12081 1726882432.42349: variable 'ansible_shell_type' from source: unknown 12081 1726882432.42358: variable 'ansible_shell_executable' from source: unknown 12081 1726882432.42368: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882432.42377: variable 'ansible_pipelining' from source: unknown 12081 1726882432.42385: variable 'ansible_timeout' from source: unknown 12081 1726882432.42406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882432.42608: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882432.42681: variable 'omit' from source: magic vars 12081 1726882432.42691: starting attempt loop 12081 1726882432.42697: running the handler 12081 1726882432.42899: variable 'ansible_facts' from source: unknown 12081 1726882432.44504: _low_level_execute_command(): starting 12081 1726882432.44515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882432.46259: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882432.46266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.46291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.46295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882432.46297: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.46349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.46990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.46998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.47115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.49068: stdout chunk (state=3): >>>/root <<< 12081 1726882432.49175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882432.49261: stderr chunk (state=3): >>><<< 12081 1726882432.49267: stdout chunk (state=3): >>><<< 12081 1726882432.49384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882432.49388: _low_level_execute_command(): starting 12081 1726882432.49391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333 `" && echo ansible-tmp-1726882432.4928854-14429-173485536295333="` echo /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333 `" ) && sleep 0' 12081 1726882432.50210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.50214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.50257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882432.50260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.50263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882432.50267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.50315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.51588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.51597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.51702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.53689: stdout chunk (state=3): >>>ansible-tmp-1726882432.4928854-14429-173485536295333=/root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333 <<< 12081 1726882432.53803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882432.53882: stderr chunk (state=3): >>><<< 12081 1726882432.53885: stdout chunk (state=3): >>><<< 12081 1726882432.53904: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882432.4928854-14429-173485536295333=/root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882432.53936: variable 'ansible_module_compression' from source: unknown 12081 1726882432.53987: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12081 1726882432.54042: variable 'ansible_facts' from source: unknown 12081 1726882432.54483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/AnsiballZ_systemd.py 12081 1726882432.54641: Sending initial data 12081 1726882432.54644: Sent initial data (156 bytes) 12081 1726882432.55840: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882432.55857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.55874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882432.55896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.55942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882432.55958: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882432.55974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.55993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882432.56014: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882432.56025: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 12081 1726882432.56036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.56048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882432.56071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.56088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882432.56107: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882432.56127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.56215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.56240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.56257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.56649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.58177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882432.58278: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882432.58381: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-12081i6b718uh/tmpdp_s5xz3 /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/AnsiballZ_systemd.py <<< 12081 1726882432.58480: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882432.61488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882432.61579: stderr chunk (state=3): >>><<< 12081 1726882432.61605: stdout chunk (state=3): >>><<< 12081 1726882432.61608: done transferring module to remote 12081 1726882432.61631: _low_level_execute_command(): starting 12081 1726882432.61634: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/ /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/AnsiballZ_systemd.py && sleep 0' 12081 1726882432.62343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882432.62350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.62384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.62393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.62408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882432.62411: stderr chunk (state=3): 
>>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.62477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.62479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.62481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.62577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.64324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882432.64391: stderr chunk (state=3): >>><<< 12081 1726882432.64395: stdout chunk (state=3): >>><<< 12081 1726882432.64501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882432.64505: _low_level_execute_command(): starting 12081 
1726882432.64507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/AnsiballZ_systemd.py && sleep 0' 12081 1726882432.65149: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.65189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.65193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.65245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.65252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.65255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.65358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.90146: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", 
"TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 12081 1726882432.90186: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "13975552", "MemoryAvailable": "infinity", "CPUUsageNSec": "980682000", "TasksCurrent": "3", 
"IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 12081 1726882432.90196: stdout chunk 
(state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", 
"AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12081 1726882432.91792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882432.91796: stdout chunk (state=3): >>><<< 12081 1726882432.91799: stderr chunk (state=3): >>><<< 12081 1726882432.91804: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13975552", "MemoryAvailable": "infinity", "CPUUsageNSec": "980682000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", 
"ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882432.91975: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882432.92067: _low_level_execute_command(): starting 12081 1726882432.92071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882432.4928854-14429-173485536295333/ > /dev/null 2>&1 && sleep 0' 12081 1726882432.93029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882432.93034: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882432.93079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882432.93082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.93097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882432.93102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882432.93114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882432.93119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882432.93207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882432.93220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882432.93229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882432.93350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882432.95223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882432.95256: stderr chunk (state=3): >>><<< 12081 1726882432.95260: stdout chunk (state=3): >>><<< 12081 1726882432.95615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882432.95619: handler run complete 12081 1726882432.95622: attempt loop complete, returning result 12081 1726882432.95624: _execute() done 12081 1726882432.95626: dumping result to json 12081 1726882432.95628: done dumping result, returning 12081 1726882432.95630: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-0a3f-ff3c-000000000a3a] 12081 1726882432.95632: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3a 12081 1726882432.95748: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3a 12081 1726882432.95752: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882432.95812: no more pending results, returning what we have 12081 1726882432.95815: results queue empty 12081 1726882432.95816: checking for any_errors_fatal 12081 
1726882432.95824: done checking for any_errors_fatal 12081 1726882432.95825: checking for max_fail_percentage 12081 1726882432.95827: done checking for max_fail_percentage 12081 1726882432.95828: checking to see if all hosts have failed and the running result is not ok 12081 1726882432.95829: done checking to see if all hosts have failed 12081 1726882432.95829: getting the remaining hosts for this loop 12081 1726882432.95831: done getting the remaining hosts for this loop 12081 1726882432.95834: getting the next task for host managed_node3 12081 1726882432.95841: done getting next task for host managed_node3 12081 1726882432.95848: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882432.95854: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882432.95868: getting variables 12081 1726882432.95870: in VariableManager get_vars() 12081 1726882432.95913: Calling all_inventory to load vars for managed_node3 12081 1726882432.95916: Calling groups_inventory to load vars for managed_node3 12081 1726882432.95918: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882432.95929: Calling all_plugins_play to load vars for managed_node3 12081 1726882432.95931: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882432.95934: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.03946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.04889: done with get_vars() 12081 1726882433.04910: done getting variables 12081 1726882433.04947: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:53 -0400 (0:00:00.979) 0:00:52.852 ****** 12081 1726882433.04982: entering _queue_task() for managed_node3/service 12081 1726882433.05287: worker is 1 (out of 1 available) 12081 1726882433.05300: exiting _queue_task() for managed_node3/service 12081 1726882433.05312: done queuing things up, now waiting for results queue to drain 12081 1726882433.05314: waiting for pending results... 
12081 1726882433.05675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882433.05848: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3b 12081 1726882433.05872: variable 'ansible_search_path' from source: unknown 12081 1726882433.05880: variable 'ansible_search_path' from source: unknown 12081 1726882433.05922: calling self._execute() 12081 1726882433.06027: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.06040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.06064: variable 'omit' from source: magic vars 12081 1726882433.06573: variable 'ansible_distribution_major_version' from source: facts 12081 1726882433.06585: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882433.06667: variable 'network_provider' from source: set_fact 12081 1726882433.06680: Evaluated conditional (network_provider == "nm"): True 12081 1726882433.06756: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882433.06818: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882433.06951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882433.08592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882433.08681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882433.08721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882433.08769: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882433.08802: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882433.08889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882433.08923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882433.08956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882433.09004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882433.09025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882433.09087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882433.09127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882433.09156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882433.09207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882433.09235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882433.09282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882433.09308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882433.09346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882433.09393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882433.09410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882433.09588: variable 'network_connections' from source: task vars 12081 1726882433.09607: variable 'controller_profile' from source: play vars 12081 1726882433.09695: variable 'controller_profile' from source: play vars 12081 1726882433.09712: variable 'controller_device' from source: play vars 12081 1726882433.09785: variable 'controller_device' from source: play vars 12081 1726882433.09800: variable 'dhcp_interface1' from source: play vars 12081 1726882433.09862: variable 'dhcp_interface1' from source: play vars 
12081 1726882433.09885: variable 'port1_profile' from source: play vars 12081 1726882433.09946: variable 'port1_profile' from source: play vars 12081 1726882433.09958: variable 'dhcp_interface1' from source: play vars 12081 1726882433.10029: variable 'dhcp_interface1' from source: play vars 12081 1726882433.10041: variable 'controller_profile' from source: play vars 12081 1726882433.10136: variable 'controller_profile' from source: play vars 12081 1726882433.10161: variable 'port2_profile' from source: play vars 12081 1726882433.10289: variable 'port2_profile' from source: play vars 12081 1726882433.10301: variable 'dhcp_interface2' from source: play vars 12081 1726882433.10398: variable 'dhcp_interface2' from source: play vars 12081 1726882433.10403: variable 'controller_profile' from source: play vars 12081 1726882433.10448: variable 'controller_profile' from source: play vars 12081 1726882433.10521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882433.10671: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882433.10699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882433.10721: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882433.10750: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882433.10782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882433.10798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882433.10820: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882433.10838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882433.10884: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882433.11046: variable 'network_connections' from source: task vars 12081 1726882433.11050: variable 'controller_profile' from source: play vars 12081 1726882433.11099: variable 'controller_profile' from source: play vars 12081 1726882433.11105: variable 'controller_device' from source: play vars 12081 1726882433.11148: variable 'controller_device' from source: play vars 12081 1726882433.11154: variable 'dhcp_interface1' from source: play vars 12081 1726882433.11201: variable 'dhcp_interface1' from source: play vars 12081 1726882433.11208: variable 'port1_profile' from source: play vars 12081 1726882433.11251: variable 'port1_profile' from source: play vars 12081 1726882433.11258: variable 'dhcp_interface1' from source: play vars 12081 1726882433.11303: variable 'dhcp_interface1' from source: play vars 12081 1726882433.11309: variable 'controller_profile' from source: play vars 12081 1726882433.11350: variable 'controller_profile' from source: play vars 12081 1726882433.11358: variable 'port2_profile' from source: play vars 12081 1726882433.11401: variable 'port2_profile' from source: play vars 12081 1726882433.11410: variable 'dhcp_interface2' from source: play vars 12081 1726882433.11451: variable 'dhcp_interface2' from source: play vars 12081 1726882433.11457: variable 'controller_profile' from source: play vars 12081 1726882433.11501: variable 'controller_profile' from source: play vars 12081 1726882433.11538: Evaluated conditional 
(__network_wpa_supplicant_required): False 12081 1726882433.11542: when evaluation is False, skipping this task 12081 1726882433.11545: _execute() done 12081 1726882433.11547: dumping result to json 12081 1726882433.11549: done dumping result, returning 12081 1726882433.11557: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-0a3f-ff3c-000000000a3b] 12081 1726882433.11565: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3b 12081 1726882433.11657: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3b 12081 1726882433.11661: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12081 1726882433.11708: no more pending results, returning what we have 12081 1726882433.11711: results queue empty 12081 1726882433.11712: checking for any_errors_fatal 12081 1726882433.11735: done checking for any_errors_fatal 12081 1726882433.11736: checking for max_fail_percentage 12081 1726882433.11738: done checking for max_fail_percentage 12081 1726882433.11738: checking to see if all hosts have failed and the running result is not ok 12081 1726882433.11740: done checking to see if all hosts have failed 12081 1726882433.11740: getting the remaining hosts for this loop 12081 1726882433.11742: done getting the remaining hosts for this loop 12081 1726882433.11746: getting the next task for host managed_node3 12081 1726882433.11754: done getting next task for host managed_node3 12081 1726882433.11758: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882433.11765: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882433.11785: getting variables 12081 1726882433.11787: in VariableManager get_vars() 12081 1726882433.11827: Calling all_inventory to load vars for managed_node3 12081 1726882433.11830: Calling groups_inventory to load vars for managed_node3 12081 1726882433.11832: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882433.11842: Calling all_plugins_play to load vars for managed_node3 12081 1726882433.11844: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882433.11847: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.12932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.14706: done with get_vars() 12081 1726882433.14744: done getting variables 12081 1726882433.14889: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:53 -0400 (0:00:00.099) 0:00:52.952 ****** 12081 1726882433.14939: entering _queue_task() for managed_node3/service 12081 1726882433.15270: worker is 1 (out of 1 available) 12081 1726882433.15284: exiting _queue_task() for managed_node3/service 12081 1726882433.15300: done queuing things up, now waiting for results queue to drain 12081 1726882433.15302: waiting for pending results... 12081 1726882433.15494: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882433.15615: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3c 12081 1726882433.15626: variable 'ansible_search_path' from source: unknown 12081 1726882433.15631: variable 'ansible_search_path' from source: unknown 12081 1726882433.15666: calling self._execute() 12081 1726882433.15743: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.15747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.15763: variable 'omit' from source: magic vars 12081 1726882433.16045: variable 'ansible_distribution_major_version' from source: facts 12081 1726882433.16058: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882433.16142: variable 'network_provider' from source: set_fact 12081 1726882433.16146: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882433.16149: when evaluation is False, skipping this task 12081 1726882433.16151: _execute() done 12081 1726882433.16154: dumping result to json 12081 1726882433.16160: done dumping result, 
returning 12081 1726882433.16168: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-0a3f-ff3c-000000000a3c] 12081 1726882433.16175: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3c 12081 1726882433.16277: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3c 12081 1726882433.16279: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882433.16323: no more pending results, returning what we have 12081 1726882433.16327: results queue empty 12081 1726882433.16328: checking for any_errors_fatal 12081 1726882433.16335: done checking for any_errors_fatal 12081 1726882433.16335: checking for max_fail_percentage 12081 1726882433.16337: done checking for max_fail_percentage 12081 1726882433.16338: checking to see if all hosts have failed and the running result is not ok 12081 1726882433.16339: done checking to see if all hosts have failed 12081 1726882433.16340: getting the remaining hosts for this loop 12081 1726882433.16341: done getting the remaining hosts for this loop 12081 1726882433.16345: getting the next task for host managed_node3 12081 1726882433.16355: done getting next task for host managed_node3 12081 1726882433.16359: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882433.16367: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882433.16390: getting variables 12081 1726882433.16392: in VariableManager get_vars() 12081 1726882433.16433: Calling all_inventory to load vars for managed_node3 12081 1726882433.16435: Calling groups_inventory to load vars for managed_node3 12081 1726882433.16437: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882433.16449: Calling all_plugins_play to load vars for managed_node3 12081 1726882433.16452: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882433.16457: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.17649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.18878: done with get_vars() 12081 1726882433.18903: done getting variables 12081 1726882433.18948: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is 
present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:53 -0400 (0:00:00.040) 0:00:52.992 ****** 12081 1726882433.18985: entering _queue_task() for managed_node3/copy 12081 1726882433.19240: worker is 1 (out of 1 available) 12081 1726882433.19257: exiting _queue_task() for managed_node3/copy 12081 1726882433.19272: done queuing things up, now waiting for results queue to drain 12081 1726882433.19274: waiting for pending results... 12081 1726882433.19469: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882433.19572: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3d 12081 1726882433.19584: variable 'ansible_search_path' from source: unknown 12081 1726882433.19588: variable 'ansible_search_path' from source: unknown 12081 1726882433.19619: calling self._execute() 12081 1726882433.19700: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.19704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.19713: variable 'omit' from source: magic vars 12081 1726882433.20015: variable 'ansible_distribution_major_version' from source: facts 12081 1726882433.20025: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882433.20110: variable 'network_provider' from source: set_fact 12081 1726882433.20114: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882433.20116: when evaluation is False, skipping this task 12081 1726882433.20119: _execute() done 12081 1726882433.20122: dumping result to json 12081 1726882433.20125: done dumping result, returning 12081 1726882433.20134: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-0a3f-ff3c-000000000a3d] 12081 
1726882433.20140: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3d 12081 1726882433.20236: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3d 12081 1726882433.20238: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12081 1726882433.20305: no more pending results, returning what we have 12081 1726882433.20309: results queue empty 12081 1726882433.20310: checking for any_errors_fatal 12081 1726882433.20317: done checking for any_errors_fatal 12081 1726882433.20318: checking for max_fail_percentage 12081 1726882433.20320: done checking for max_fail_percentage 12081 1726882433.20321: checking to see if all hosts have failed and the running result is not ok 12081 1726882433.20322: done checking to see if all hosts have failed 12081 1726882433.20323: getting the remaining hosts for this loop 12081 1726882433.20324: done getting the remaining hosts for this loop 12081 1726882433.20328: getting the next task for host managed_node3 12081 1726882433.20336: done getting next task for host managed_node3 12081 1726882433.20340: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882433.20345: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882433.20373: getting variables 12081 1726882433.20375: in VariableManager get_vars() 12081 1726882433.20411: Calling all_inventory to load vars for managed_node3 12081 1726882433.20414: Calling groups_inventory to load vars for managed_node3 12081 1726882433.20416: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882433.20425: Calling all_plugins_play to load vars for managed_node3 12081 1726882433.20427: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882433.20430: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.21416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.22358: done with get_vars() 12081 1726882433.22381: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:53 -0400 (0:00:00.034) 0:00:53.027 ****** 12081 1726882433.22451: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882433.22712: worker is 1 (out of 1 available) 12081 1726882433.22725: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882433.22739: done queuing 
things up, now waiting for results queue to drain 12081 1726882433.22740: waiting for pending results... 12081 1726882433.22939: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882433.23065: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3e 12081 1726882433.23077: variable 'ansible_search_path' from source: unknown 12081 1726882433.23081: variable 'ansible_search_path' from source: unknown 12081 1726882433.23115: calling self._execute() 12081 1726882433.23201: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.23205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.23213: variable 'omit' from source: magic vars 12081 1726882433.23498: variable 'ansible_distribution_major_version' from source: facts 12081 1726882433.23510: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882433.23519: variable 'omit' from source: magic vars 12081 1726882433.23575: variable 'omit' from source: magic vars 12081 1726882433.23696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882433.25282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882433.25333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882433.25363: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882433.25389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882433.25408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882433.25471: variable 'network_provider' from source: set_fact 12081 1726882433.25574: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882433.25595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882433.25613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882433.25638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882433.25650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882433.25711: variable 'omit' from source: magic vars 12081 1726882433.25788: variable 'omit' from source: magic vars 12081 1726882433.25868: variable 'network_connections' from source: task vars 12081 1726882433.25878: variable 'controller_profile' from source: play vars 12081 1726882433.25927: variable 'controller_profile' from source: play vars 12081 1726882433.25935: variable 'controller_device' from source: play vars 12081 1726882433.25981: variable 'controller_device' from source: play vars 12081 1726882433.25989: variable 'dhcp_interface1' from source: play vars 12081 1726882433.26036: variable 'dhcp_interface1' from source: play vars 12081 1726882433.26044: variable 'port1_profile' from source: play vars 12081 1726882433.26087: variable 'port1_profile' from source: play vars 12081 1726882433.26093: variable 'dhcp_interface1' from source: play vars 12081 
1726882433.26140: variable 'dhcp_interface1' from source: play vars 12081 1726882433.26145: variable 'controller_profile' from source: play vars 12081 1726882433.26189: variable 'controller_profile' from source: play vars 12081 1726882433.26196: variable 'port2_profile' from source: play vars 12081 1726882433.26239: variable 'port2_profile' from source: play vars 12081 1726882433.26245: variable 'dhcp_interface2' from source: play vars 12081 1726882433.26290: variable 'dhcp_interface2' from source: play vars 12081 1726882433.26296: variable 'controller_profile' from source: play vars 12081 1726882433.26339: variable 'controller_profile' from source: play vars 12081 1726882433.26473: variable 'omit' from source: magic vars 12081 1726882433.26480: variable '__lsr_ansible_managed' from source: task vars 12081 1726882433.26522: variable '__lsr_ansible_managed' from source: task vars 12081 1726882433.26668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12081 1726882433.26815: Loaded config def from plugin (lookup/template) 12081 1726882433.26819: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12081 1726882433.26839: File lookup term: get_ansible_managed.j2 12081 1726882433.26842: variable 'ansible_search_path' from source: unknown 12081 1726882433.26846: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12081 1726882433.26859: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12081 1726882433.26873: variable 'ansible_search_path' from source: unknown 12081 1726882433.30581: variable 'ansible_managed' from source: unknown 12081 1726882433.30672: variable 'omit' from source: magic vars 12081 1726882433.30693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882433.30712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882433.30726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882433.30739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882433.30747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882433.30771: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882433.30778: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.30781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.30843: Set connection var ansible_pipelining to False 12081 1726882433.30846: Set connection 
var ansible_shell_type to sh 12081 1726882433.30851: Set connection var ansible_shell_executable to /bin/sh 12081 1726882433.30866: Set connection var ansible_connection to ssh 12081 1726882433.30869: Set connection var ansible_timeout to 10 12081 1726882433.30872: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882433.30888: variable 'ansible_shell_executable' from source: unknown 12081 1726882433.30891: variable 'ansible_connection' from source: unknown 12081 1726882433.30898: variable 'ansible_module_compression' from source: unknown 12081 1726882433.30900: variable 'ansible_shell_type' from source: unknown 12081 1726882433.30903: variable 'ansible_shell_executable' from source: unknown 12081 1726882433.30905: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.30907: variable 'ansible_pipelining' from source: unknown 12081 1726882433.30909: variable 'ansible_timeout' from source: unknown 12081 1726882433.30911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.30999: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882433.31007: variable 'omit' from source: magic vars 12081 1726882433.31013: starting attempt loop 12081 1726882433.31015: running the handler 12081 1726882433.31028: _low_level_execute_command(): starting 12081 1726882433.31034: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882433.31547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.31560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882433.31590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.31602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.31658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882433.31671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.31791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.33483: stdout chunk (state=3): >>>/root <<< 12081 1726882433.33583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882433.33635: stderr chunk (state=3): >>><<< 12081 1726882433.33638: stdout chunk (state=3): >>><<< 12081 1726882433.33660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882433.33672: _low_level_execute_command(): starting 12081 1726882433.33680: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457 `" && echo ansible-tmp-1726882433.3366-14476-257176903653457="` echo /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457 `" ) && sleep 0' 12081 1726882433.34153: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.34163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.34186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882433.34198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882433.34207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.34248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882433.34259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882433.34274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.34387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.36292: stdout chunk (state=3): >>>ansible-tmp-1726882433.3366-14476-257176903653457=/root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457 <<< 12081 1726882433.36390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882433.36444: stderr chunk (state=3): >>><<< 12081 1726882433.36447: stdout chunk (state=3): >>><<< 12081 1726882433.36466: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882433.3366-14476-257176903653457=/root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882433.36507: variable 'ansible_module_compression' from source: unknown 12081 1726882433.36548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12081 1726882433.36575: variable 'ansible_facts' from source: unknown 12081 1726882433.36640: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/AnsiballZ_network_connections.py 12081 1726882433.36749: Sending initial data 12081 1726882433.36757: Sent initial data (165 bytes) 12081 1726882433.37420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882433.37426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.37437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.37482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.37486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.37488: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.37534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882433.37545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.37656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.39397: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882433.39492: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882433.39592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmphm89hveo /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/AnsiballZ_network_connections.py <<< 12081 1726882433.39688: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882433.41589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882433.41795: stderr chunk (state=3): >>><<< 12081 1726882433.41799: stdout chunk (state=3): >>><<< 12081 1726882433.41801: done 
transferring module to remote 12081 1726882433.41803: _low_level_execute_command(): starting 12081 1726882433.41806: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/ /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/AnsiballZ_network_connections.py && sleep 0' 12081 1726882433.42415: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882433.42431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882433.42446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.42470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.42514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882433.42528: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882433.42542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.42564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882433.42579: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882433.42591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882433.42603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882433.42617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.42633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.42646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882433.42661: stderr chunk (state=3): >>>debug2: 
match found <<< 12081 1726882433.42679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.42756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882433.42781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882433.42797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.42932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.44785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882433.44802: stderr chunk (state=3): >>><<< 12081 1726882433.44806: stdout chunk (state=3): >>><<< 12081 1726882433.44826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882433.44829: 
_low_level_execute_command(): starting 12081 1726882433.44834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/AnsiballZ_network_connections.py && sleep 0' 12081 1726882433.45508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882433.45517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882433.45527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.45541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.45589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882433.45595: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882433.45605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.45618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882433.45625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882433.45631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882433.45639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882433.45647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882433.45658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.45667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882433.45679: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882433.45692: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.45765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882433.45784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882433.45801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.45936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.83811: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12081 1726882433.85783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882433.85787: stdout chunk (state=3): >>><<< 12081 1726882433.85790: stderr chunk (state=3): >>><<< 12081 1726882433.85965: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, 
"ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882433.85970: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882433.85978: _low_level_execute_command(): starting 12081 1726882433.85981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882433.3366-14476-257176903653457/ > /dev/null 2>&1 && sleep 0' 12081 1726882433.87492: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882433.87497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882433.87519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882433.87524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882433.87597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882433.87614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882433.87738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882433.89593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882433.89685: stderr chunk (state=3): >>><<< 12081 1726882433.89689: stdout chunk (state=3): >>><<< 12081 1726882433.89770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882433.89773: handler run complete 12081 1726882433.89776: attempt loop complete, returning result 12081 1726882433.89778: _execute() done 12081 1726882433.89780: dumping result to json 12081 1726882433.89782: done dumping result, returning 12081 1726882433.89883: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-0a3f-ff3c-000000000a3e] 12081 1726882433.89887: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3e 12081 1726882433.90008: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3e 12081 1726882433.90012: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", 
"interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active) 12081 1726882433.90149: no more pending results, returning what we have 12081 1726882433.90155: results queue empty 12081 1726882433.90156: checking for any_errors_fatal 12081 1726882433.90165: done checking for any_errors_fatal 12081 1726882433.90165: checking for max_fail_percentage 12081 1726882433.90167: done checking for max_fail_percentage 12081 1726882433.90168: checking to see if all hosts have failed and the running result is not ok 12081 1726882433.90169: done checking to see if all hosts have failed 12081 1726882433.90170: getting the remaining hosts for this loop 12081 1726882433.90171: done getting the remaining hosts for this loop 12081 1726882433.90175: getting the next task for host managed_node3 12081 1726882433.90182: done getting next task for host managed_node3 12081 1726882433.90186: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882433.90191: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882433.90202: getting variables 12081 1726882433.90207: in VariableManager get_vars() 12081 1726882433.90248: Calling all_inventory to load vars for managed_node3 12081 1726882433.90251: Calling groups_inventory to load vars for managed_node3 12081 1726882433.90255: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882433.90268: Calling all_plugins_play to load vars for managed_node3 12081 1726882433.90270: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882433.90273: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.91223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.92177: done with get_vars() 12081 1726882433.92195: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 
Friday 20 September 2024 21:33:53 -0400 (0:00:00.698) 0:00:53.725 ****** 12081 1726882433.92263: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882433.92500: worker is 1 (out of 1 available) 12081 1726882433.92527: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882433.92540: done queuing things up, now waiting for results queue to drain 12081 1726882433.92541: waiting for pending results... 12081 1726882433.92801: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882433.92995: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a3f 12081 1726882433.93021: variable 'ansible_search_path' from source: unknown 12081 1726882433.93030: variable 'ansible_search_path' from source: unknown 12081 1726882433.93077: calling self._execute() 12081 1726882433.93191: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.93211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.93230: variable 'omit' from source: magic vars 12081 1726882433.93638: variable 'ansible_distribution_major_version' from source: facts 12081 1726882433.93667: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882433.93811: variable 'network_state' from source: role '' defaults 12081 1726882433.93826: Evaluated conditional (network_state != {}): False 12081 1726882433.93833: when evaluation is False, skipping this task 12081 1726882433.93839: _execute() done 12081 1726882433.93845: dumping result to json 12081 1726882433.93857: done dumping result, returning 12081 1726882433.93873: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-0a3f-ff3c-000000000a3f] 12081 1726882433.93889: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3f skipping: [managed_node3] => { 
"changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882433.94048: no more pending results, returning what we have 12081 1726882433.94052: results queue empty 12081 1726882433.94053: checking for any_errors_fatal 12081 1726882433.94073: done checking for any_errors_fatal 12081 1726882433.94074: checking for max_fail_percentage 12081 1726882433.94076: done checking for max_fail_percentage 12081 1726882433.94077: checking to see if all hosts have failed and the running result is not ok 12081 1726882433.94079: done checking to see if all hosts have failed 12081 1726882433.94079: getting the remaining hosts for this loop 12081 1726882433.94081: done getting the remaining hosts for this loop 12081 1726882433.94085: getting the next task for host managed_node3 12081 1726882433.94094: done getting next task for host managed_node3 12081 1726882433.94098: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882433.94106: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882433.94130: getting variables 12081 1726882433.94131: in VariableManager get_vars() 12081 1726882433.94299: Calling all_inventory to load vars for managed_node3 12081 1726882433.94302: Calling groups_inventory to load vars for managed_node3 12081 1726882433.94305: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882433.94321: Calling all_plugins_play to load vars for managed_node3 12081 1726882433.94325: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882433.94328: Calling groups_plugins_play to load vars for managed_node3 12081 1726882433.95111: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a3f 12081 1726882433.95114: WORKER PROCESS EXITING 12081 1726882433.95608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882433.97987: done with get_vars() 12081 1726882433.98022: done getting variables 12081 1726882433.98070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:53 -0400 (0:00:00.058) 0:00:53.783 ****** 12081 1726882433.98099: entering _queue_task() for managed_node3/debug 12081 1726882433.98449: worker is 1 (out of 1 available) 12081 1726882433.98465: exiting _queue_task() for managed_node3/debug 12081 1726882433.98478: done 
queuing things up, now waiting for results queue to drain 12081 1726882433.98480: waiting for pending results... 12081 1726882433.98786: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882433.98958: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a40 12081 1726882433.98982: variable 'ansible_search_path' from source: unknown 12081 1726882433.98990: variable 'ansible_search_path' from source: unknown 12081 1726882433.99038: calling self._execute() 12081 1726882433.99268: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882433.99547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882433.99571: variable 'omit' from source: magic vars 12081 1726882434.00021: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.00039: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.00050: variable 'omit' from source: magic vars 12081 1726882434.00139: variable 'omit' from source: magic vars 12081 1726882434.00182: variable 'omit' from source: magic vars 12081 1726882434.00233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882434.00278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882434.00303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882434.00329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.00344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.00382: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882434.00389: variable 
'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.00396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.00506: Set connection var ansible_pipelining to False 12081 1726882434.00513: Set connection var ansible_shell_type to sh 12081 1726882434.00525: Set connection var ansible_shell_executable to /bin/sh 12081 1726882434.00535: Set connection var ansible_connection to ssh 12081 1726882434.00544: Set connection var ansible_timeout to 10 12081 1726882434.00555: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882434.00586: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.00594: variable 'ansible_connection' from source: unknown 12081 1726882434.00600: variable 'ansible_module_compression' from source: unknown 12081 1726882434.00606: variable 'ansible_shell_type' from source: unknown 12081 1726882434.00612: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.00617: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.00624: variable 'ansible_pipelining' from source: unknown 12081 1726882434.00629: variable 'ansible_timeout' from source: unknown 12081 1726882434.00641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.00790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882434.00806: variable 'omit' from source: magic vars 12081 1726882434.00815: starting attempt loop 12081 1726882434.00820: running the handler 12081 1726882434.00975: variable '__network_connections_result' from source: set_fact 12081 1726882434.01049: handler run complete 12081 1726882434.01084: attempt loop complete, returning 
result 12081 1726882434.01092: _execute() done 12081 1726882434.01098: dumping result to json 12081 1726882434.01105: done dumping result, returning 12081 1726882434.01117: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-000000000a40] 12081 1726882434.01128: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a40 12081 1726882434.01248: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a40 12081 1726882434.01259: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)" ] } 12081 1726882434.01486: no more pending results, returning what we have 12081 1726882434.01490: results queue empty 12081 1726882434.01490: checking for any_errors_fatal 12081 1726882434.01500: done checking for any_errors_fatal 12081 1726882434.01501: checking for max_fail_percentage 12081 1726882434.01503: done checking for max_fail_percentage 12081 1726882434.01504: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.01505: done checking to see if all hosts have failed 12081 1726882434.01505: getting the remaining hosts for this loop 12081 
1726882434.01507: done getting the remaining hosts for this loop 12081 1726882434.01511: getting the next task for host managed_node3 12081 1726882434.01519: done getting next task for host managed_node3 12081 1726882434.01523: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882434.01529: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882434.01540: getting variables 12081 1726882434.01542: in VariableManager get_vars() 12081 1726882434.01589: Calling all_inventory to load vars for managed_node3 12081 1726882434.01592: Calling groups_inventory to load vars for managed_node3 12081 1726882434.01595: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.01607: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.01618: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.01621: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.03872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.05875: done with get_vars() 12081 1726882434.05908: done getting variables 12081 1726882434.05983: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:54 -0400 (0:00:00.079) 0:00:53.863 ****** 12081 1726882434.06022: entering _queue_task() for managed_node3/debug 12081 1726882434.06403: worker is 1 (out of 1 available) 12081 1726882434.06416: exiting _queue_task() for managed_node3/debug 12081 1726882434.06430: done queuing things up, now waiting for results queue to drain 12081 1726882434.06431: waiting for pending results... 
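The skip/run pattern in the transcript above follows from two role variables: `network_connections` (populated with the bond profiles, so those tasks executed) and `network_state` (left at its role default of `{}`, so every task guarded by `when: network_state != {}` was skipped, as the log records with "Evaluated conditional (network_state != {}): False"). A minimal Python sketch of that decision logic, with the connection payload copied from the module invocation in the log; the boolean check here is a simplification, not Ansible's actual Jinja2 `when:` evaluation:

```python
# Role default, per "variable 'network_state' from source: role '' defaults".
network_state = {}

# Payload copied from the module_args shown in the log above.
network_connections = [
    {"name": "bond0", "state": "up", "type": "bond",
     "interface_name": "nm-bond",
     "bond": {"mode": "active-backup", "arp_interval": 60,
              "arp_ip_target": "192.0.2.128", "arp_validate": "none",
              "primary": "test1"},
     "ip": {"route_metric4": 65535}},
    {"name": "bond0.0", "state": "up", "type": "ethernet",
     "interface_name": "test1", "controller": "bond0"},
    {"name": "bond0.1", "state": "up", "type": "ethernet",
     "interface_name": "test2", "controller": "bond0"},
]

def task_outcome(condition: bool) -> str:
    """Mirrors 'when evaluation is False, skipping this task'."""
    return "run" if condition else "skip"

# "Configure networking state" (and later "Show debug messages for the
# network_state") are guarded by: when: network_state != {}
print(task_outcome(network_state != {}))        # -> skip
# The network_connections tasks ran because the list is non-empty.
print(task_outcome(bool(network_connections)))  # -> run
```

This is why the same run shows `changed: true` for the connection-profile task but `skip_reason: "Conditional result was False"` for the state tasks.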
12081 1726882434.06752: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882434.06940: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a41 12081 1726882434.06966: variable 'ansible_search_path' from source: unknown 12081 1726882434.06977: variable 'ansible_search_path' from source: unknown 12081 1726882434.07023: calling self._execute() 12081 1726882434.07129: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.07144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.07151: variable 'omit' from source: magic vars 12081 1726882434.07454: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.07470: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.07477: variable 'omit' from source: magic vars 12081 1726882434.07525: variable 'omit' from source: magic vars 12081 1726882434.07547: variable 'omit' from source: magic vars 12081 1726882434.07585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882434.07613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882434.07630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882434.07643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.07653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.07683: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882434.07686: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.07688: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882434.07763: Set connection var ansible_pipelining to False 12081 1726882434.07768: Set connection var ansible_shell_type to sh 12081 1726882434.07773: Set connection var ansible_shell_executable to /bin/sh 12081 1726882434.07776: Set connection var ansible_connection to ssh 12081 1726882434.07781: Set connection var ansible_timeout to 10 12081 1726882434.07785: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882434.07806: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.07809: variable 'ansible_connection' from source: unknown 12081 1726882434.07812: variable 'ansible_module_compression' from source: unknown 12081 1726882434.07814: variable 'ansible_shell_type' from source: unknown 12081 1726882434.07817: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.07819: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.07821: variable 'ansible_pipelining' from source: unknown 12081 1726882434.07823: variable 'ansible_timeout' from source: unknown 12081 1726882434.07825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.07927: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882434.07936: variable 'omit' from source: magic vars 12081 1726882434.07940: starting attempt loop 12081 1726882434.07948: running the handler 12081 1726882434.07991: variable '__network_connections_result' from source: set_fact 12081 1726882434.08047: variable '__network_connections_result' from source: set_fact 12081 1726882434.08182: handler run complete 12081 1726882434.08203: attempt loop complete, returning result 12081 1726882434.08206: 
_execute() done 12081 1726882434.08208: dumping result to json 12081 1726882434.08213: done dumping result, returning 12081 1726882434.08220: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-000000000a41] 12081 1726882434.08227: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a41 12081 1726882434.08329: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a41 12081 1726882434.08332: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up 
connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)" ] } } 12081 1726882434.08450: no more pending results, returning what we have 12081 1726882434.08454: results queue empty 12081 1726882434.08455: checking for any_errors_fatal 12081 1726882434.08461: done checking for any_errors_fatal 12081 1726882434.08462: checking for max_fail_percentage 12081 1726882434.08465: done checking for max_fail_percentage 12081 1726882434.08466: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.08467: done checking to see if all hosts have failed 12081 1726882434.08467: getting the remaining hosts for this loop 12081 1726882434.08469: done getting the remaining hosts for this loop 12081 1726882434.08472: getting the next task for host managed_node3 12081 1726882434.08478: done getting next task for host managed_node3 12081 1726882434.08481: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882434.08486: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882434.08496: getting variables 12081 1726882434.08497: in VariableManager get_vars() 12081 1726882434.08533: Calling all_inventory to load vars for managed_node3 12081 1726882434.08535: Calling groups_inventory to load vars for managed_node3 12081 1726882434.08537: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.08545: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.08547: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.08550: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.10039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.11029: done with get_vars() 12081 1726882434.11048: done getting variables 12081 1726882434.11097: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:54 -0400 (0:00:00.051) 0:00:53.914 ****** 12081 1726882434.11124: entering _queue_task() for managed_node3/debug 12081 1726882434.11373: worker is 1 (out of 1 available) 12081 1726882434.11386: exiting _queue_task() for managed_node3/debug 12081 1726882434.11400: done queuing things up, now waiting for results queue to drain 12081 1726882434.11402: waiting for pending results... 12081 1726882434.11595: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882434.11697: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a42 12081 1726882434.11710: variable 'ansible_search_path' from source: unknown 12081 1726882434.11714: variable 'ansible_search_path' from source: unknown 12081 1726882434.11744: calling self._execute() 12081 1726882434.11822: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.11827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.11833: variable 'omit' from source: magic vars 12081 1726882434.12200: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.12219: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.12342: variable 'network_state' from source: role '' defaults 12081 1726882434.12360: Evaluated conditional (network_state != {}): False 12081 1726882434.12371: when evaluation is False, skipping this task 12081 1726882434.12378: _execute() done 12081 1726882434.12385: dumping result to json 12081 1726882434.12392: done 
dumping result, returning 12081 1726882434.12403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-0a3f-ff3c-000000000a42] 12081 1726882434.12414: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a42 12081 1726882434.12532: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a42 12081 1726882434.12540: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12081 1726882434.12831: no more pending results, returning what we have 12081 1726882434.12835: results queue empty 12081 1726882434.12836: checking for any_errors_fatal 12081 1726882434.12843: done checking for any_errors_fatal 12081 1726882434.12844: checking for max_fail_percentage 12081 1726882434.12846: done checking for max_fail_percentage 12081 1726882434.12847: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.12848: done checking to see if all hosts have failed 12081 1726882434.12848: getting the remaining hosts for this loop 12081 1726882434.12850: done getting the remaining hosts for this loop 12081 1726882434.12853: getting the next task for host managed_node3 12081 1726882434.12859: done getting next task for host managed_node3 12081 1726882434.12867: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882434.12873: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882434.12893: getting variables 12081 1726882434.12895: in VariableManager get_vars() 12081 1726882434.12940: Calling all_inventory to load vars for managed_node3 12081 1726882434.12942: Calling groups_inventory to load vars for managed_node3 12081 1726882434.12945: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.12954: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.12957: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.12960: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.14682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.16555: done with get_vars() 12081 1726882434.16589: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:54 -0400 (0:00:00.055) 0:00:53.969 ****** 12081 1726882434.16705: entering _queue_task() for managed_node3/ping 12081 1726882434.17086: worker is 1 (out of 1 available) 12081 1726882434.17099: exiting _queue_task() for managed_node3/ping 12081 1726882434.17112: done queuing things up, now waiting for results queue to drain 12081 1726882434.17113: waiting for pending 
results... 12081 1726882434.17551: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882434.17632: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a43 12081 1726882434.17645: variable 'ansible_search_path' from source: unknown 12081 1726882434.17649: variable 'ansible_search_path' from source: unknown 12081 1726882434.17692: calling self._execute() 12081 1726882434.17794: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.17800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.17810: variable 'omit' from source: magic vars 12081 1726882434.18243: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.18262: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.18271: variable 'omit' from source: magic vars 12081 1726882434.18344: variable 'omit' from source: magic vars 12081 1726882434.18382: variable 'omit' from source: magic vars 12081 1726882434.18422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882434.18459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882434.18487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882434.18505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.18516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.18550: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882434.18555: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.18558: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882434.18670: Set connection var ansible_pipelining to False 12081 1726882434.18673: Set connection var ansible_shell_type to sh 12081 1726882434.18682: Set connection var ansible_shell_executable to /bin/sh 12081 1726882434.18684: Set connection var ansible_connection to ssh 12081 1726882434.18694: Set connection var ansible_timeout to 10 12081 1726882434.18703: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882434.18729: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.18732: variable 'ansible_connection' from source: unknown 12081 1726882434.18735: variable 'ansible_module_compression' from source: unknown 12081 1726882434.18738: variable 'ansible_shell_type' from source: unknown 12081 1726882434.18740: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.18743: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.18745: variable 'ansible_pipelining' from source: unknown 12081 1726882434.18747: variable 'ansible_timeout' from source: unknown 12081 1726882434.18749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.18976: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882434.18985: variable 'omit' from source: magic vars 12081 1726882434.18991: starting attempt loop 12081 1726882434.18994: running the handler 12081 1726882434.19007: _low_level_execute_command(): starting 12081 1726882434.19020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882434.19831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.19843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882434.19859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.19878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.19924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.19933: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.19943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.19958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.19978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.19983: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.19992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.20002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.20021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.20028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.20037: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.20047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.20132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.20146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.20150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.20293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882434.22271: stdout chunk (state=3): >>>/root <<< 12081 1726882434.23039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.23129: stderr chunk (state=3): >>><<< 12081 1726882434.23132: stdout chunk (state=3): >>><<< 12081 1726882434.23159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.23174: _low_level_execute_command(): starting 12081 1726882434.23182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882 `" && echo ansible-tmp-1726882434.2315726-14517-109249927686882="` echo /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882 `" ) && sleep 0' 12081 1726882434.24930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.24935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.25015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.25019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.25104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.25111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.25233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.25238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.25259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.25428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.27504: stdout chunk (state=3): >>>ansible-tmp-1726882434.2315726-14517-109249927686882=/root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882 <<< 12081 1726882434.27683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.27687: stderr chunk (state=3): >>><<< 12081 1726882434.27690: stdout chunk (state=3): >>><<< 12081 1726882434.27709: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882434.2315726-14517-109249927686882=/root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.27756: variable 'ansible_module_compression' from source: unknown 12081 1726882434.27796: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12081 1726882434.27835: variable 'ansible_facts' from source: unknown 12081 1726882434.27902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/AnsiballZ_ping.py 12081 1726882434.28699: Sending initial data 12081 1726882434.28703: Sent initial data (153 bytes) 12081 1726882434.31710: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.31718: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.31729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.31747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.31994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.32001: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.32011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.32024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.32032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.32038: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.32046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.32058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.32070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.32077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.32084: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.32093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.32166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.32186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.32196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.32321: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.34100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882434.34195: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882434.34298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpo5skgnw_ /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/AnsiballZ_ping.py <<< 12081 1726882434.34394: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882434.35779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.35932: stderr chunk (state=3): >>><<< 12081 1726882434.35935: stdout chunk (state=3): >>><<< 12081 1726882434.35958: done transferring module to remote 12081 1726882434.35970: _low_level_execute_command(): starting 12081 1726882434.35975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/ /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/AnsiballZ_ping.py && sleep 0' 12081 1726882434.37019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.37033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
12081 1726882434.37046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.37076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.37131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.37145: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.37162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.37182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.37193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.37209: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.37226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.37240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.37258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.37273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.37283: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.37295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.37375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.37392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.37406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.37552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882434.39312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.39390: stderr chunk (state=3): >>><<< 12081 1726882434.39404: stdout chunk (state=3): >>><<< 12081 1726882434.39469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.39472: _low_level_execute_command(): starting 12081 1726882434.39475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/AnsiballZ_ping.py && sleep 0' 12081 1726882434.40203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.40217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.40239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882434.40261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.40306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.40318: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.40330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.40359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.40376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.40387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.40399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.40413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.40430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.40442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.40459: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.40481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.40557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.40588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.40605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.40807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.54009: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12081 
1726882434.55182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882434.55186: stdout chunk (state=3): >>><<< 12081 1726882434.55188: stderr chunk (state=3): >>><<< 12081 1726882434.55190: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882434.55194: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882434.55200: _low_level_execute_command(): starting 12081 1726882434.55202: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882434.2315726-14517-109249927686882/ > /dev/null 2>&1 && sleep 0' 12081 1726882434.55663: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.55671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.55709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.55713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.55719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882434.55726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.55733: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.55794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.55797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.55903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.57709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.57758: stderr chunk (state=3): >>><<< 12081 1726882434.57761: stdout chunk (state=3): >>><<< 12081 1726882434.57776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 12081 1726882434.57782: handler run complete 12081 1726882434.57795: attempt loop complete, returning result 12081 1726882434.57798: _execute() done 12081 1726882434.57800: dumping result to json 12081 1726882434.57802: done dumping result, returning 12081 1726882434.57811: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-0a3f-ff3c-000000000a43] 12081 1726882434.57817: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a43 12081 1726882434.57908: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a43 12081 1726882434.57911: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12081 1726882434.57974: no more pending results, returning what we have 12081 1726882434.57978: results queue empty 12081 1726882434.57979: checking for any_errors_fatal 12081 1726882434.57988: done checking for any_errors_fatal 12081 1726882434.57988: checking for max_fail_percentage 12081 1726882434.57990: done checking for max_fail_percentage 12081 1726882434.57991: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.57992: done checking to see if all hosts have failed 12081 1726882434.57993: getting the remaining hosts for this loop 12081 1726882434.57994: done getting the remaining hosts for this loop 12081 1726882434.57998: getting the next task for host managed_node3 12081 1726882434.58008: done getting next task for host managed_node3 12081 1726882434.58010: ^ task is: TASK: meta (role_complete) 12081 1726882434.58015: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882434.58026: getting variables 12081 1726882434.58028: in VariableManager get_vars() 12081 1726882434.58073: Calling all_inventory to load vars for managed_node3 12081 1726882434.58076: Calling groups_inventory to load vars for managed_node3 12081 1726882434.58078: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.58087: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.58089: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.58092: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.58941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.59895: done with get_vars() 12081 1726882434.59912: done getting variables 12081 1726882434.59974: done queuing things up, now waiting for results queue to drain 12081 1726882434.59975: results queue empty 12081 1726882434.59976: checking for any_errors_fatal 12081 1726882434.59978: done checking for 
any_errors_fatal 12081 1726882434.59978: checking for max_fail_percentage 12081 1726882434.59979: done checking for max_fail_percentage 12081 1726882434.59979: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.59980: done checking to see if all hosts have failed 12081 1726882434.59980: getting the remaining hosts for this loop 12081 1726882434.59981: done getting the remaining hosts for this loop 12081 1726882434.59982: getting the next task for host managed_node3 12081 1726882434.59985: done getting next task for host managed_node3 12081 1726882434.59987: ^ task is: TASK: Show result 12081 1726882434.59988: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882434.59995: getting variables 12081 1726882434.59996: in VariableManager get_vars() 12081 1726882434.60007: Calling all_inventory to load vars for managed_node3 12081 1726882434.60009: Calling groups_inventory to load vars for managed_node3 12081 1726882434.60010: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.60013: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.60015: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.60016: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.60787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.61729: done with get_vars() 12081 1726882434.61746: done getting variables 12081 1726882434.61784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Friday 20 September 2024 21:33:54 -0400 (0:00:00.451) 0:00:54.421 ****** 12081 1726882434.61805: entering _queue_task() for managed_node3/debug 12081 1726882434.62109: worker is 1 (out of 1 available) 12081 1726882434.62121: exiting _queue_task() for managed_node3/debug 12081 1726882434.62135: done queuing things up, now waiting for results queue to drain 12081 1726882434.62137: waiting for pending results... 
12081 1726882434.62327: running TaskExecutor() for managed_node3/TASK: Show result 12081 1726882434.62400: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000a73 12081 1726882434.62411: variable 'ansible_search_path' from source: unknown 12081 1726882434.62415: variable 'ansible_search_path' from source: unknown 12081 1726882434.62446: calling self._execute() 12081 1726882434.62525: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.62530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.62537: variable 'omit' from source: magic vars 12081 1726882434.62822: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.62834: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.62840: variable 'omit' from source: magic vars 12081 1726882434.62857: variable 'omit' from source: magic vars 12081 1726882434.62880: variable 'omit' from source: magic vars 12081 1726882434.62918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882434.62943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882434.62960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882434.62975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.62985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.63009: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882434.63013: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.63015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.63087: Set 
connection var ansible_pipelining to False 12081 1726882434.63090: Set connection var ansible_shell_type to sh 12081 1726882434.63095: Set connection var ansible_shell_executable to /bin/sh 12081 1726882434.63098: Set connection var ansible_connection to ssh 12081 1726882434.63103: Set connection var ansible_timeout to 10 12081 1726882434.63108: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882434.63129: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.63132: variable 'ansible_connection' from source: unknown 12081 1726882434.63135: variable 'ansible_module_compression' from source: unknown 12081 1726882434.63138: variable 'ansible_shell_type' from source: unknown 12081 1726882434.63142: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.63144: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.63147: variable 'ansible_pipelining' from source: unknown 12081 1726882434.63149: variable 'ansible_timeout' from source: unknown 12081 1726882434.63151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.63245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882434.63256: variable 'omit' from source: magic vars 12081 1726882434.63259: starting attempt loop 12081 1726882434.63263: running the handler 12081 1726882434.63301: variable '__network_connections_result' from source: set_fact 12081 1726882434.63358: variable '__network_connections_result' from source: set_fact 12081 1726882434.63484: handler run complete 12081 1726882434.63505: attempt loop complete, returning result 12081 1726882434.63508: _execute() done 12081 1726882434.63511: dumping result to json 12081 
1726882434.63515: done dumping result, returning 12081 1726882434.63522: done running TaskExecutor() for managed_node3/TASK: Show result [0e448fcc-3ce9-0a3f-ff3c-000000000a73] 12081 1726882434.63529: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a73 12081 1726882434.63631: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000a73 12081 1726882434.63634: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 
'bond0': add connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2229dee4-9920-46b8-8e08-f14f29160e64 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e88d1421-07ed-4ef2-9acc-b9d849acc275 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, eeef3ef8-a51c-440e-8607-8eb479082482 (not-active)" ] } } 12081 1726882434.63732: no more pending results, returning what we have 12081 1726882434.63736: results queue empty 12081 1726882434.63742: checking for any_errors_fatal 12081 1726882434.63744: done checking for any_errors_fatal 12081 1726882434.63745: checking for max_fail_percentage 12081 1726882434.63747: done checking for max_fail_percentage 12081 1726882434.63747: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.63748: done checking to see if all hosts have failed 12081 1726882434.63749: getting the remaining hosts for this loop 12081 1726882434.63750: done getting the remaining hosts for this loop 12081 1726882434.63756: getting the next task for host managed_node3 12081 1726882434.63762: done getting next task for host managed_node3 12081 1726882434.63767: ^ task is: TASK: Asserts 12081 1726882434.63769: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882434.63773: getting variables 12081 1726882434.63775: in VariableManager get_vars() 12081 1726882434.63816: Calling all_inventory to load vars for managed_node3 12081 1726882434.63819: Calling groups_inventory to load vars for managed_node3 12081 1726882434.63821: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.63831: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.63833: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.63836: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.64765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.65705: done with get_vars() 12081 1726882434.65720: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:33:54 -0400 (0:00:00.039) 0:00:54.460 ****** 12081 1726882434.65791: entering _queue_task() for managed_node3/include_tasks 12081 1726882434.66016: worker is 1 (out of 1 available) 12081 1726882434.66029: exiting _queue_task() for managed_node3/include_tasks 12081 1726882434.66043: done queuing things up, now waiting for results queue to drain 12081 1726882434.66044: waiting for pending results... 
12081 1726882434.66222: running TaskExecutor() for managed_node3/TASK: Asserts 12081 1726882434.66293: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008ef 12081 1726882434.66305: variable 'ansible_search_path' from source: unknown 12081 1726882434.66309: variable 'ansible_search_path' from source: unknown 12081 1726882434.66345: variable 'lsr_assert' from source: include params 12081 1726882434.66501: variable 'lsr_assert' from source: include params 12081 1726882434.66557: variable 'omit' from source: magic vars 12081 1726882434.66655: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.66660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.66672: variable 'omit' from source: magic vars 12081 1726882434.66831: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.66843: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.66847: variable 'item' from source: unknown 12081 1726882434.66896: variable 'item' from source: unknown 12081 1726882434.66920: variable 'item' from source: unknown 12081 1726882434.66964: variable 'item' from source: unknown 12081 1726882434.67094: dumping result to json 12081 1726882434.67097: done dumping result, returning 12081 1726882434.67099: done running TaskExecutor() for managed_node3/TASK: Asserts [0e448fcc-3ce9-0a3f-ff3c-0000000008ef] 12081 1726882434.67100: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ef 12081 1726882434.67134: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008ef 12081 1726882434.67137: WORKER PROCESS EXITING 12081 1726882434.67166: no more pending results, returning what we have 12081 1726882434.67171: in VariableManager get_vars() 12081 1726882434.67208: Calling all_inventory to load vars for managed_node3 12081 1726882434.67210: Calling groups_inventory to load vars for managed_node3 12081 1726882434.67213: Calling all_plugins_inventory 
to load vars for managed_node3 12081 1726882434.67222: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.67224: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.67227: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.68041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.68970: done with get_vars() 12081 1726882434.68985: variable 'ansible_search_path' from source: unknown 12081 1726882434.68986: variable 'ansible_search_path' from source: unknown 12081 1726882434.69013: we have included files to process 12081 1726882434.69014: generating all_blocks data 12081 1726882434.69016: done generating all_blocks data 12081 1726882434.69019: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882434.69020: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882434.69021: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12081 1726882434.69190: in VariableManager get_vars() 12081 1726882434.69209: done with get_vars() 12081 1726882434.69246: in VariableManager get_vars() 12081 1726882434.69276: done with get_vars() 12081 1726882434.69289: done processing included file 12081 1726882434.69291: iterating over new_blocks loaded from include file 12081 1726882434.69292: in VariableManager get_vars() 12081 1726882434.69308: done with get_vars() 12081 1726882434.69309: filtering new block on tags 12081 1726882434.69347: done filtering new block on tags 12081 1726882434.69349: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 12081 1726882434.69356: extending task lists for all hosts with included blocks 12081 1726882434.71783: done extending task lists 12081 1726882434.71784: done processing included files 12081 1726882434.71785: results queue empty 12081 1726882434.71785: checking for any_errors_fatal 12081 1726882434.71790: done checking for any_errors_fatal 12081 1726882434.71791: checking for max_fail_percentage 12081 1726882434.71792: done checking for max_fail_percentage 12081 1726882434.71792: checking to see if all hosts have failed and the running result is not ok 12081 1726882434.71793: done checking to see if all hosts have failed 12081 1726882434.71793: getting the remaining hosts for this loop 12081 1726882434.71794: done getting the remaining hosts for this loop 12081 1726882434.71796: getting the next task for host managed_node3 12081 1726882434.71799: done getting next task for host managed_node3 12081 1726882434.71800: ^ task is: TASK: ** TEST check bond settings 12081 1726882434.71802: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12081 1726882434.71804: getting variables 12081 1726882434.71805: in VariableManager get_vars() 12081 1726882434.71816: Calling all_inventory to load vars for managed_node3 12081 1726882434.71817: Calling groups_inventory to load vars for managed_node3 12081 1726882434.71818: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882434.71824: Calling all_plugins_play to load vars for managed_node3 12081 1726882434.71826: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882434.71827: Calling groups_plugins_play to load vars for managed_node3 12081 1726882434.72643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882434.74305: done with get_vars() 12081 1726882434.74333: done getting variables 12081 1726882434.74388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 21:33:54 -0400 (0:00:00.086) 0:00:54.547 ****** 12081 1726882434.74421: entering _queue_task() for managed_node3/command 12081 1726882434.74762: worker is 1 (out of 1 available) 12081 1726882434.74774: exiting _queue_task() for managed_node3/command 12081 1726882434.74787: done queuing things up, now waiting for results queue to drain 12081 1726882434.74789: waiting for pending results... 
12081 1726882434.75088: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 12081 1726882434.75205: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000c2a 12081 1726882434.75226: variable 'ansible_search_path' from source: unknown 12081 1726882434.75235: variable 'ansible_search_path' from source: unknown 12081 1726882434.75289: variable 'bond_options_to_assert' from source: set_fact 12081 1726882434.75504: variable 'bond_options_to_assert' from source: set_fact 12081 1726882434.75619: variable 'omit' from source: magic vars 12081 1726882434.75763: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.75783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.75796: variable 'omit' from source: magic vars 12081 1726882434.76037: variable 'ansible_distribution_major_version' from source: facts 12081 1726882434.76052: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882434.76066: variable 'omit' from source: magic vars 12081 1726882434.76120: variable 'omit' from source: magic vars 12081 1726882434.76329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882434.78782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882434.78857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882434.78899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882434.78958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882434.78993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882434.79101: variable 'controller_device' from source: play vars 12081 
1726882434.79113: variable 'bond_opt' from source: unknown 12081 1726882434.79146: variable 'omit' from source: magic vars 12081 1726882434.79187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882434.79217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882434.79239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882434.79269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.79285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882434.79321: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882434.79330: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882434.79337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.79445: Set connection var ansible_pipelining to False 12081 1726882434.79456: Set connection var ansible_shell_type to sh 12081 1726882434.79476: Set connection var ansible_shell_executable to /bin/sh 12081 1726882434.79484: Set connection var ansible_connection to ssh 12081 1726882434.79494: Set connection var ansible_timeout to 10 12081 1726882434.79503: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882434.79532: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.79540: variable 'ansible_connection' from source: unknown 12081 1726882434.79546: variable 'ansible_module_compression' from source: unknown 12081 1726882434.79556: variable 'ansible_shell_type' from source: unknown 12081 1726882434.79565: variable 'ansible_shell_executable' from source: unknown 12081 1726882434.79572: variable 'ansible_host' from source: host vars 
for 'managed_node3' 12081 1726882434.79584: variable 'ansible_pipelining' from source: unknown 12081 1726882434.79591: variable 'ansible_timeout' from source: unknown 12081 1726882434.79599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882434.79712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882434.79728: variable 'omit' from source: magic vars 12081 1726882434.79737: starting attempt loop 12081 1726882434.79743: running the handler 12081 1726882434.79766: _low_level_execute_command(): starting 12081 1726882434.79777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882434.80513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.80527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.80541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.80567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.80616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.80628: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.80642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.80666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.80683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.80694: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 12081 1726882434.80705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.80717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.80731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.80743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.80756: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.80774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.80849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.80871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.80888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.81024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.82686: stdout chunk (state=3): >>>/root <<< 12081 1726882434.82786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.82862: stderr chunk (state=3): >>><<< 12081 1726882434.82867: stdout chunk (state=3): >>><<< 12081 1726882434.82892: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.82904: _low_level_execute_command(): starting 12081 1726882434.82911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629 `" && echo ansible-tmp-1726882434.8289146-14541-223645178267629="` echo /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629 `" ) && sleep 0' 12081 1726882434.83652: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.83661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.83674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.83689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.83745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.83756: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.83764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.83780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.83787: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.83793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.83801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.83811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.83822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.83829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.83835: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.83849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.83923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.83943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.83951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.84329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.85982: stdout chunk (state=3): >>>ansible-tmp-1726882434.8289146-14541-223645178267629=/root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629 <<< 12081 1726882434.86099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.86189: stderr chunk (state=3): >>><<< 12081 1726882434.86201: stdout chunk (state=3): >>><<< 12081 1726882434.86469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882434.8289146-14541-223645178267629=/root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.86473: variable 'ansible_module_compression' from source: unknown 12081 1726882434.86475: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882434.86477: variable 'ansible_facts' from source: unknown 12081 1726882434.86479: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/AnsiballZ_command.py 12081 1726882434.86596: Sending initial data 12081 1726882434.86599: Sent initial data (156 bytes) 12081 1726882434.87695: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.87716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.87733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.87753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.87799: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.87812: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.87833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.87852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.87868: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.87881: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.87894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.87908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.87931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.87959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.87975: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.87990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.88074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.88125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.88223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.88356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.90116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882434.90212: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882434.90318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmprfqhgtzy /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/AnsiballZ_command.py <<< 12081 1726882434.90417: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882434.91932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.92053: stderr chunk (state=3): >>><<< 12081 1726882434.92057: stdout chunk (state=3): >>><<< 12081 1726882434.92059: done transferring module to remote 12081 1726882434.92061: _low_level_execute_command(): starting 12081 1726882434.92066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/ /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/AnsiballZ_command.py && sleep 0' 12081 1726882434.92637: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.92650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.92668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.92689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.92736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 12081 1726882434.92749: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882434.92767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.92786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.92797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.92810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.92826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.92839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.92854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.92868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.92881: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.92895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.92974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.92990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.93005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.93136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882434.95136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882434.95139: stdout chunk (state=3): >>><<< 12081 1726882434.95141: stderr chunk (state=3): >>><<< 12081 1726882434.95236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882434.95239: _low_level_execute_command(): starting 12081 1726882434.95242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/AnsiballZ_command.py && sleep 0' 12081 1726882434.95833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882434.95845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.95859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.95878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.95926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.95937: stderr chunk (state=3): >>>debug2: match not found <<< 12081 
1726882434.95949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.95967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882434.95978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882434.95987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882434.95998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882434.96014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882434.96032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882434.96042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882434.96052: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882434.96067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882434.96149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882434.96171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882434.96186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882434.96318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.10037: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:33:55.096053", "end": "2024-09-20 21:33:55.098869", "delta": "0:00:00.002816", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882435.11266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882435.11270: stdout chunk (state=3): >>><<< 12081 1726882435.11273: stderr chunk (state=3): >>><<< 12081 1726882435.11409: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:33:55.096053", "end": "2024-09-20 21:33:55.098869", "delta": "0:00:00.002816", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882435.11419: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882435.11422: _low_level_execute_command(): starting 12081 1726882435.11424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882434.8289146-14541-223645178267629/ > /dev/null 2>&1 && sleep 0' 12081 1726882435.13572: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.13576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.13579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.13581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.13584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.13586: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.13588: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.13591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.13593: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.13595: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.13597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.13599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.13601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.13602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.13604: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.13606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.13679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.13740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.13749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.13956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.15782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.15868: stderr chunk (state=3): >>><<< 12081 1726882435.15872: stdout chunk (state=3): >>><<< 12081 1726882435.15890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.15897: handler run complete 12081 1726882435.15922: Evaluated conditional (False): False 12081 1726882435.16094: variable 'bond_opt' from source: unknown 12081 1726882435.16100: variable 'result' from source: set_fact 12081 1726882435.16116: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882435.16127: attempt loop complete, returning result 12081 1726882435.16146: variable 'bond_opt' from source: unknown 12081 1726882435.16220: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.002816", "end": "2024-09-20 21:33:55.098869", "rc": 0, "start": "2024-09-20 21:33:55.096053" } STDOUT: active-backup 1 12081 1726882435.16433: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.16436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.16439: 
variable 'omit' from source: magic vars 12081 1726882435.16576: variable 'ansible_distribution_major_version' from source: facts 12081 1726882435.16581: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882435.16586: variable 'omit' from source: magic vars 12081 1726882435.16600: variable 'omit' from source: magic vars 12081 1726882435.16999: variable 'controller_device' from source: play vars 12081 1726882435.17002: variable 'bond_opt' from source: unknown 12081 1726882435.17022: variable 'omit' from source: magic vars 12081 1726882435.17045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882435.17166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.17176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.17189: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882435.17192: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.17194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.17389: Set connection var ansible_pipelining to False 12081 1726882435.17392: Set connection var ansible_shell_type to sh 12081 1726882435.17399: Set connection var ansible_shell_executable to /bin/sh 12081 1726882435.17402: Set connection var ansible_connection to ssh 12081 1726882435.17407: Set connection var ansible_timeout to 10 12081 1726882435.17412: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882435.17436: variable 'ansible_shell_executable' from source: unknown 12081 1726882435.17439: variable 'ansible_connection' from source: unknown 12081 1726882435.17441: variable 'ansible_module_compression' from 
source: unknown 12081 1726882435.17443: variable 'ansible_shell_type' from source: unknown 12081 1726882435.17446: variable 'ansible_shell_executable' from source: unknown 12081 1726882435.17448: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.17455: variable 'ansible_pipelining' from source: unknown 12081 1726882435.17457: variable 'ansible_timeout' from source: unknown 12081 1726882435.17459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.17667: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882435.17674: variable 'omit' from source: magic vars 12081 1726882435.17678: starting attempt loop 12081 1726882435.17681: running the handler 12081 1726882435.17689: _low_level_execute_command(): starting 12081 1726882435.17692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882435.18547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.18551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.18595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882435.18598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12081 1726882435.18616: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 12081 1726882435.18621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.18633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882435.18638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.18721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.18735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.18741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.18871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.20734: stdout chunk (state=3): >>>/root <<< 12081 1726882435.20945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.20949: stderr chunk (state=3): >>><<< 12081 1726882435.20952: stdout chunk (state=3): >>><<< 12081 1726882435.21072: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.21075: _low_level_execute_command(): starting 12081 1726882435.21078: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515 `" && echo ansible-tmp-1726882435.2097712-14541-60864634269515="` echo /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515 `" ) && sleep 0' 12081 1726882435.22015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.22024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.22034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.22052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.22096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.22104: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.22114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.22128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.22135: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.22141: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.22147: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 12081 1726882435.22158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.22170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.22177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.22183: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.22192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.22265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.22283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.22295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.22423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.24309: stdout chunk (state=3): >>>ansible-tmp-1726882435.2097712-14541-60864634269515=/root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515 <<< 12081 1726882435.24469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.24524: stderr chunk (state=3): >>><<< 12081 1726882435.24528: stdout chunk (state=3): >>><<< 12081 1726882435.24575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.2097712-14541-60864634269515=/root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.24582: variable 'ansible_module_compression' from source: unknown 12081 1726882435.24622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882435.24641: variable 'ansible_facts' from source: unknown 12081 1726882435.24706: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/AnsiballZ_command.py 12081 1726882435.24902: Sending initial data 12081 1726882435.24906: Sent initial data (155 bytes) 12081 1726882435.25926: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.25935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.25950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.25961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.26003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.26013: stderr chunk (state=3): >>>debug2: match not found <<< 12081 
1726882435.26023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.26036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.26044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.26058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.26061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.26074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.26085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.26092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.26099: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.26108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.26186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.26201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.26204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.26336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.28091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882435.28191: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882435.28293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp4enzpxxg /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/AnsiballZ_command.py <<< 12081 1726882435.28410: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882435.29739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.29909: stderr chunk (state=3): >>><<< 12081 1726882435.29912: stdout chunk (state=3): >>><<< 12081 1726882435.29914: done transferring module to remote 12081 1726882435.29916: _low_level_execute_command(): starting 12081 1726882435.29919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/ /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/AnsiballZ_command.py && sleep 0' 12081 1726882435.30630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.30645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.30668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.30690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.31539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.31556: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.31644: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.31689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.31701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.31712: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.31746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.31767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.31785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.31798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.31810: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.31846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.31957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.31978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.31999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.32192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.34015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.34018: stdout chunk (state=3): >>><<< 12081 1726882435.34021: stderr chunk (state=3): >>><<< 12081 1726882435.34116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.34123: _low_level_execute_command(): starting 12081 1726882435.34126: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/AnsiballZ_command.py && sleep 0' 12081 1726882435.34718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.34731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.34743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.34762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.34810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.34821: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.34833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.34849: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 12081 1726882435.34862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.34874: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.34885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.34902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.34917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.34928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.34938: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.34950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.35035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.35059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.35077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.35212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.48858: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 21:33:55.484125", "end": "2024-09-20 21:33:55.487135", "delta": "0:00:00.003010", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882435.50085: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882435.50141: stderr chunk (state=3): >>><<< 12081 1726882435.50144: stdout chunk (state=3): >>><<< 12081 1726882435.50279: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 21:33:55.484125", "end": "2024-09-20 21:33:55.487135", "delta": "0:00:00.003010", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
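The record above shows the `ansible.legacy.command` module returning its result as a single JSON object on stdout, which the controller parses and then evaluates the task's conditional against (`Evaluated conditional (bond_opt.value in result.stdout): True` a few records below). A minimal sketch of that parse-and-check step, using an abridged copy of the exact payload from this log (the `bond_opt` item is the one shown in the loop result):

```python
import json

# Abridged from the module stdout captured in the log above.
raw = '''{"changed": true, "stdout": "60", "stderr": "", "rc": 0,
 "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"],
 "delta": "0:00:00.003010", "msg": ""}'''

result = json.loads(raw)

# The loop item from the log: ok: [...] => (item={'key': 'arp_interval', 'value': '60'})
bond_opt = {"key": "arp_interval", "value": "60"}

# Mirrors the conditional the controller evaluates:
#   (bond_opt.value in result.stdout) -> True
print(bond_opt["value"] in result["stdout"])  # → True
```

This is only an illustration of the check; the controller performs the equivalent test through its Jinja2 conditional evaluation, not this literal code.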
12081 1726882435.50283: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882435.50286: _low_level_execute_command(): starting 12081 1726882435.50288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.2097712-14541-60864634269515/ > /dev/null 2>&1 && sleep 0' 12081 1726882435.50904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.50917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.50930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.50955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.51001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.51015: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.51031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.51049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.51072: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.51084: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.51097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.51111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.51126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.51273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.51296: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.51310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.51405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.51425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.51443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.51587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.53519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.53523: stdout chunk (state=3): >>><<< 12081 1726882435.53526: stderr chunk (state=3): >>><<< 12081 1726882435.53769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.53773: handler run complete 12081 1726882435.53775: Evaluated conditional (False): False 12081 1726882435.53777: variable 'bond_opt' from source: unknown 12081 1726882435.53779: variable 'result' from source: set_fact 12081 1726882435.53781: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882435.53782: attempt loop complete, returning result 12081 1726882435.53795: variable 'bond_opt' from source: unknown 12081 1726882435.53862: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.003010", "end": "2024-09-20 21:33:55.487135", "rc": 0, "start": "2024-09-20 21:33:55.484125" } STDOUT: 60 12081 1726882435.54097: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.54109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.54121: variable 'omit' from source: magic vars 12081 1726882435.54324: variable 'ansible_distribution_major_version' from source: facts 12081 1726882435.54369: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 12081 1726882435.54437: variable 'omit' from source: magic vars 12081 1726882435.54486: variable 'omit' from source: magic vars 12081 1726882435.54676: variable 'controller_device' from source: play vars 12081 1726882435.54714: variable 'bond_opt' from source: unknown 12081 1726882435.54736: variable 'omit' from source: magic vars 12081 1726882435.55533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882435.55548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.55623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.55648: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882435.55667: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.55676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.55816: Set connection var ansible_pipelining to False 12081 1726882435.56370: Set connection var ansible_shell_type to sh 12081 1726882435.56373: Set connection var ansible_shell_executable to /bin/sh 12081 1726882435.56375: Set connection var ansible_connection to ssh 12081 1726882435.56377: Set connection var ansible_timeout to 10 12081 1726882435.56379: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882435.56381: variable 'ansible_shell_executable' from source: unknown 12081 1726882435.56383: variable 'ansible_connection' from source: unknown 12081 1726882435.56385: variable 'ansible_module_compression' from source: unknown 12081 1726882435.56386: variable 'ansible_shell_type' from source: unknown 12081 1726882435.56388: variable 'ansible_shell_executable' from source: unknown 
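The variable lookups above (`controller_device` from play vars, `bond_opt` from the loop) imply the task builds one `cat` command per bonding option. A hedged reconstruction of that path templating, assuming the loop shape from the single `arp_interval`/`60` item visible in this log (any additional option names would be assumptions):

```python
# Reconstruction of the per-item command seen in the log records above.
# Only the arp_interval item is visible in this log; the list shape is assumed.
bond_opts = [{"key": "arp_interval", "value": "60"}]
controller_device = "nm-bond"  # from the executed path /sys/class/net/nm-bond/...

for bond_opt in bond_opts:
    # Equivalent of templating /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
    path = f"/sys/class/net/{controller_device}/bonding/{bond_opt['key']}"
    print(["cat", path])  # matches the "cmd" field in the module result
```

The real task templates this in Jinja2 inside the playbook; this sketch only shows how the loop item and play var combine into the command the log records.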
12081 1726882435.56390: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.56391: variable 'ansible_pipelining' from source: unknown 12081 1726882435.56393: variable 'ansible_timeout' from source: unknown 12081 1726882435.56395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.56397: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882435.56399: variable 'omit' from source: magic vars 12081 1726882435.56401: starting attempt loop 12081 1726882435.56403: running the handler 12081 1726882435.56404: _low_level_execute_command(): starting 12081 1726882435.56407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882435.56959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.56968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.56979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.56993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.57030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.57037: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.57047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.57059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.57071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 
1726882435.57078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.57086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.57095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.57106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.57113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.57119: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.57129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.57200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.57216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.57219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.57879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.58999: stdout chunk (state=3): >>>/root <<< 12081 1726882435.59174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.59177: stdout chunk (state=3): >>><<< 12081 1726882435.59184: stderr chunk (state=3): >>><<< 12081 1726882435.59201: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.59209: _low_level_execute_command(): starting 12081 1726882435.59214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746 `" && echo ansible-tmp-1726882435.5920012-14541-205900204699746="` echo /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746 `" ) && sleep 0' 12081 1726882435.60880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.60884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.61000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.61004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.61072: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.61077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.61288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.61303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.61308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.61444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.63336: stdout chunk (state=3): >>>ansible-tmp-1726882435.5920012-14541-205900204699746=/root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746 <<< 12081 1726882435.63501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.63555: stderr chunk (state=3): >>><<< 12081 1726882435.63559: stdout chunk (state=3): >>><<< 12081 1726882435.63679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.5920012-14541-205900204699746=/root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.63682: variable 'ansible_module_compression' from source: unknown 12081 1726882435.63684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882435.63687: variable 'ansible_facts' from source: unknown 12081 1726882435.63769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/AnsiballZ_command.py 12081 1726882435.63930: Sending initial data 12081 1726882435.63933: Sent initial data (156 bytes) 12081 1726882435.64888: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.64896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.64906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.64918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.64957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.64961: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.64972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.64990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.64996: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.65002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.65010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.65018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.65028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.65035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.65041: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.65050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.65135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.65161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.65166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.65340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.67040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882435.67138: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload 
size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882435.67242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp3vbwy0t7 /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/AnsiballZ_command.py <<< 12081 1726882435.67337: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882435.69126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.69270: stderr chunk (state=3): >>><<< 12081 1726882435.69274: stdout chunk (state=3): >>><<< 12081 1726882435.69276: done transferring module to remote 12081 1726882435.69278: _low_level_execute_command(): starting 12081 1726882435.69280: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/ /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/AnsiballZ_command.py && sleep 0' 12081 1726882435.69849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.69855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.69893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882435.69896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.69899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882435.69901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.69966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.70466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.70594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.72361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.72429: stderr chunk (state=3): >>><<< 12081 1726882435.72433: stdout chunk (state=3): >>><<< 12081 1726882435.72546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
12081 1726882435.72550: _low_level_execute_command(): starting 12081 1726882435.72552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/AnsiballZ_command.py && sleep 0' 12081 1726882435.73231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.73250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.73269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.73287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.73346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.73361: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.73381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.73398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.73408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.73428: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.73443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.73458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.73477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.73495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.73507: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.73519: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.73616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.73647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.73673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.74084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.87652: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 21:33:55.872264", "end": "2024-09-20 21:33:55.875085", "delta": "0:00:00.002821", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882435.88891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882435.88929: stderr chunk (state=3): >>><<< 12081 1726882435.88932: stdout chunk (state=3): >>><<< 12081 1726882435.89065: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 21:33:55.872264", "end": "2024-09-20 21:33:55.875085", "delta": "0:00:00.002821", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882435.89074: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882435.89076: _low_level_execute_command(): starting 12081 1726882435.89079: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.5920012-14541-205900204699746/ > /dev/null 2>&1 && sleep 0' 12081 1726882435.89685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882435.89700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.89716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.89741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.89796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.89809: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882435.89824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.89844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882435.89866: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882435.89879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882435.89891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.89905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.89920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.89932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882435.89943: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882435.89964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.90040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.90074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.90092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.90225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.92191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.92194: stdout chunk (state=3): >>><<< 12081 1726882435.92197: stderr chunk (state=3): >>><<< 12081 1726882435.92274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.92277: handler run complete 12081 1726882435.92280: Evaluated conditional (False): False 12081 1726882435.92415: variable 'bond_opt' from source: unknown 12081 1726882435.92427: variable 'result' from source: set_fact 12081 1726882435.92446: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882435.92469: attempt loop complete, returning result 12081 1726882435.92494: variable 'bond_opt' from source: unknown 12081 1726882435.92586: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.002821", "end": "2024-09-20 21:33:55.875085", "rc": 0, "start": "2024-09-20 21:33:55.872264" } STDOUT: 192.0.2.128 12081 1726882435.92813: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.92828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.92842: variable 'omit' from source: magic vars 12081 1726882435.93199: variable 'ansible_distribution_major_version' from source: facts 12081 1726882435.93211: 
Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882435.93219: variable 'omit' from source: magic vars 12081 1726882435.93239: variable 'omit' from source: magic vars 12081 1726882435.93407: variable 'controller_device' from source: play vars 12081 1726882435.93479: variable 'bond_opt' from source: unknown 12081 1726882435.93503: variable 'omit' from source: magic vars 12081 1726882435.93528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882435.93581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.93593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882435.93611: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882435.93986: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.93996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.94286: Set connection var ansible_pipelining to False 12081 1726882435.94295: Set connection var ansible_shell_type to sh 12081 1726882435.94308: Set connection var ansible_shell_executable to /bin/sh 12081 1726882435.94315: Set connection var ansible_connection to ssh 12081 1726882435.94325: Set connection var ansible_timeout to 10 12081 1726882435.94335: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882435.94383: variable 'ansible_shell_executable' from source: unknown 12081 1726882435.94393: variable 'ansible_connection' from source: unknown 12081 1726882435.94401: variable 'ansible_module_compression' from source: unknown 12081 1726882435.94408: variable 'ansible_shell_type' from source: unknown 12081 1726882435.94414: variable 'ansible_shell_executable' 
from source: unknown 12081 1726882435.94421: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882435.94433: variable 'ansible_pipelining' from source: unknown 12081 1726882435.94440: variable 'ansible_timeout' from source: unknown 12081 1726882435.94448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882435.94548: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882435.94681: variable 'omit' from source: magic vars 12081 1726882435.94691: starting attempt loop 12081 1726882435.94698: running the handler 12081 1726882435.94710: _low_level_execute_command(): starting 12081 1726882435.94718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882435.96022: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882435.96026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882435.96068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.96071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882435.96074: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882435.96127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882435.96139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882435.96150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882435.96289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882435.97878: stdout chunk (state=3): >>>/root <<< 12081 1726882435.98074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882435.98077: stdout chunk (state=3): >>><<< 12081 1726882435.98080: stderr chunk (state=3): >>><<< 12081 1726882435.98184: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882435.98187: _low_level_execute_command(): starting 12081 1726882435.98190: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606 `" && echo ansible-tmp-1726882435.980966-14541-187501588359606="` echo /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606 `" ) && sleep 0' 12081 1726882436.00340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.00344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.00381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.00385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882436.00583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.00659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.00665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.00799: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12081 1726882436.00900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.02776: stdout chunk (state=3): >>>ansible-tmp-1726882435.980966-14541-187501588359606=/root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606 <<< 12081 1726882436.02903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.02981: stderr chunk (state=3): >>><<< 12081 1726882436.02985: stdout chunk (state=3): >>><<< 12081 1726882436.03232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882435.980966-14541-187501588359606=/root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.03236: variable 'ansible_module_compression' from source: unknown 12081 1726882436.03238: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882436.03240: variable 'ansible_facts' from source: unknown 12081 1726882436.03242: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/AnsiballZ_command.py 12081 1726882436.03689: Sending initial data 12081 1726882436.03698: Sent initial data (155 bytes) 12081 1726882436.05792: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.05807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.05821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.05839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.05891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.05903: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.05915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.05931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.05942: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.05952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.05971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.05984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.05998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.06008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 <<< 12081 1726882436.06018: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.06029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.06112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.06134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.06148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.06286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.08027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882436.08124: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882436.08227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp8v5fmem6 /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/AnsiballZ_command.py <<< 12081 1726882436.08323: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882436.09876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.09962: stderr chunk (state=3): >>><<< 12081 1726882436.09971: stdout chunk (state=3): >>><<< 12081 1726882436.09989: 
done transferring module to remote 12081 1726882436.09996: _low_level_execute_command(): starting 12081 1726882436.10001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/ /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/AnsiballZ_command.py && sleep 0' 12081 1726882436.11025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.11029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.11075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.11079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882436.11090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.11104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.11109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.11189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.11208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.11213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.11334: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.13122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.13161: stderr chunk (state=3): >>><<< 12081 1726882436.13168: stdout chunk (state=3): >>><<< 12081 1726882436.13183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.13186: _low_level_execute_command(): starting 12081 1726882436.13191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/AnsiballZ_command.py && sleep 0' 12081 1726882436.14128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.14697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.14704: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.14719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.14760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.14774: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.14784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.14796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.14807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.14810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.14817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.14827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.14838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.14846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.14852: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.14862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.14939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.14957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.14967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.15134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.28895: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 21:33:56.284729", "end": "2024-09-20 21:33:56.287537", "delta": "0:00:00.002808", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882436.30089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882436.30118: stderr chunk (state=3): >>><<< 12081 1726882436.30121: stdout chunk (state=3): >>><<< 12081 1726882436.30248: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 21:33:56.284729", "end": "2024-09-20 21:33:56.287537", "delta": "0:00:00.002808", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882436.30251: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882436.30254: _low_level_execute_command(): starting 12081 1726882436.30256: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882435.980966-14541-187501588359606/ > /dev/null 2>&1 && sleep 0' 12081 1726882436.30905: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.30930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.30946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882436.30967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.31011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.31030: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.31049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.31070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.31084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.31096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.31108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.31122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.31148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.31162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.31175: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.31189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.31276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.31299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.31315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.31446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.33283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.33391: stderr 
chunk (state=3): >>><<< 12081 1726882436.33402: stdout chunk (state=3): >>><<< 12081 1726882436.33675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.33684: handler run complete 12081 1726882436.33686: Evaluated conditional (False): False 12081 1726882436.33688: variable 'bond_opt' from source: unknown 12081 1726882436.33690: variable 'result' from source: set_fact 12081 1726882436.33692: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882436.33694: attempt loop complete, returning result 12081 1726882436.33696: variable 'bond_opt' from source: unknown 12081 1726882436.33750: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": 
false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.002808", "end": "2024-09-20 21:33:56.287537", "rc": 0, "start": "2024-09-20 21:33:56.284729" } STDOUT: none 0 12081 1726882436.33982: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.33996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.34009: variable 'omit' from source: magic vars 12081 1726882436.34182: variable 'ansible_distribution_major_version' from source: facts 12081 1726882436.34193: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882436.34200: variable 'omit' from source: magic vars 12081 1726882436.34219: variable 'omit' from source: magic vars 12081 1726882436.34399: variable 'controller_device' from source: play vars 12081 1726882436.34409: variable 'bond_opt' from source: unknown 12081 1726882436.34431: variable 'omit' from source: magic vars 12081 1726882436.34468: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882436.34483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882436.34494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882436.34511: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882436.34518: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.34525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.34615: Set connection var ansible_pipelining to False 12081 1726882436.34623: Set connection var ansible_shell_type to sh 12081 1726882436.34634: Set connection var ansible_shell_executable to /bin/sh 12081 1726882436.34640: 
Set connection var ansible_connection to ssh 12081 1726882436.34648: Set connection var ansible_timeout to 10 12081 1726882436.34655: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882436.34691: variable 'ansible_shell_executable' from source: unknown 12081 1726882436.34697: variable 'ansible_connection' from source: unknown 12081 1726882436.34704: variable 'ansible_module_compression' from source: unknown 12081 1726882436.34709: variable 'ansible_shell_type' from source: unknown 12081 1726882436.34715: variable 'ansible_shell_executable' from source: unknown 12081 1726882436.34722: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.34728: variable 'ansible_pipelining' from source: unknown 12081 1726882436.34735: variable 'ansible_timeout' from source: unknown 12081 1726882436.34742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.34844: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882436.34857: variable 'omit' from source: magic vars 12081 1726882436.34867: starting attempt loop 12081 1726882436.34874: running the handler 12081 1726882436.34893: _low_level_execute_command(): starting 12081 1726882436.34903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882436.35613: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.35629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.35644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.35675: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.35720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.35731: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.35744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.35772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.35786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.35796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.35806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.35819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.35835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.35848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.35860: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.35885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.35961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.35996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.36014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.36145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.37733: stdout chunk (state=3): >>>/root <<< 12081 1726882436.37885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.37938: stderr chunk (state=3): 
>>><<< 12081 1726882436.37941: stdout chunk (state=3): >>><<< 12081 1726882436.37981: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.38062: _low_level_execute_command(): starting 12081 1726882436.38072: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677 `" && echo ansible-tmp-1726882436.3795927-14541-214048881162677="` echo /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677 `" ) && sleep 0' 12081 1726882436.38673: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.38687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.38702: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12081 1726882436.38724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.38771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.38833: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.38847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.38867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.38878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.38888: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.38897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.38908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.38921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.38935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.38945: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.38957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.39027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.39174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.39191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.39322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.41192: stdout chunk (state=3): 
>>>ansible-tmp-1726882436.3795927-14541-214048881162677=/root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677 <<< 12081 1726882436.41396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.41400: stdout chunk (state=3): >>><<< 12081 1726882436.41402: stderr chunk (state=3): >>><<< 12081 1726882436.41472: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882436.3795927-14541-214048881162677=/root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.41476: variable 'ansible_module_compression' from source: unknown 12081 1726882436.41666: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882436.41671: variable 'ansible_facts' from source: unknown 12081 
1726882436.41674: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/AnsiballZ_command.py 12081 1726882436.41746: Sending initial data 12081 1726882436.41749: Sent initial data (156 bytes) 12081 1726882436.42821: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.42834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.42846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.42874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.42912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.42923: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.42934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.42949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.42965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.42983: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.42994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.43005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.43017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.43027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.43036: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.43047: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.43131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.43156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.43180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.43322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.45085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882436.45181: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882436.45284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpdzkd4unv /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/AnsiballZ_command.py <<< 12081 1726882436.45381: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882436.47047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.47186: stderr chunk (state=3): >>><<< 12081 1726882436.47190: stdout chunk (state=3): >>><<< 12081 1726882436.47192: done transferring module to remote 12081 1726882436.47194: _low_level_execute_command(): starting 12081 1726882436.47197: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/ /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/AnsiballZ_command.py && sleep 0' 12081 1726882436.49110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.49115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.49149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.49152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.49157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.49220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.49989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.50194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.51944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.52020: stderr chunk (state=3): >>><<< 12081 1726882436.52024: stdout chunk (state=3): >>><<< 12081 1726882436.52123: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.52127: _low_level_execute_command(): starting 12081 1726882436.52129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/AnsiballZ_command.py && sleep 0' 12081 1726882436.53238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.53242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.53271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882436.53274: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.53276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.53356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.53360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.54304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.54407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.68361: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 21:33:56.679540", "end": "2024-09-20 21:33:56.682293", "delta": "0:00:00.002753", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882436.69443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882436.69530: stderr chunk (state=3): >>><<< 12081 1726882436.69534: stdout chunk (state=3): >>><<< 12081 1726882436.69614: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 21:33:56.679540", "end": "2024-09-20 21:33:56.682293", "delta": "0:00:00.002753", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882436.69701: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882436.69709: _low_level_execute_command(): starting 12081 1726882436.69714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882436.3795927-14541-214048881162677/ > /dev/null 2>&1 && sleep 0' 12081 1726882436.70368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.70389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.70403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.70421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.70470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.70488: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.70502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.70519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.70529: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.70539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.70549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.70561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.70579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.70591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.70605: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882436.70617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.70702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.70730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.70746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.70877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.72680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.72784: stderr chunk (state=3): >>><<< 12081 1726882436.72802: stdout chunk (state=3): >>><<< 12081 1726882436.73074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.73078: handler run complete 12081 1726882436.73081: Evaluated conditional (False): False 12081 1726882436.73083: variable 'bond_opt' from source: unknown 12081 1726882436.73085: variable 'result' from source: set_fact 12081 1726882436.73086: Evaluated conditional (bond_opt.value in result.stdout): True 12081 1726882436.73088: attempt loop complete, returning result 12081 1726882436.73090: variable 'bond_opt' from source: unknown 12081 1726882436.73155: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.002753", "end": "2024-09-20 21:33:56.682293", "rc": 0, "start": "2024-09-20 21:33:56.679540" } STDOUT: test1 12081 1726882436.73345: dumping result to json 12081 1726882436.73371: done dumping result, returning 12081 1726882436.73386: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [0e448fcc-3ce9-0a3f-ff3c-000000000c2a] 12081 1726882436.73397: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2a 12081 1726882436.73700: no more pending results, returning what we have 12081 
1726882436.73705: results queue empty 12081 1726882436.73706: checking for any_errors_fatal 12081 1726882436.73708: done checking for any_errors_fatal 12081 1726882436.73709: checking for max_fail_percentage 12081 1726882436.73711: done checking for max_fail_percentage 12081 1726882436.73712: checking to see if all hosts have failed and the running result is not ok 12081 1726882436.73713: done checking to see if all hosts have failed 12081 1726882436.73714: getting the remaining hosts for this loop 12081 1726882436.73716: done getting the remaining hosts for this loop 12081 1726882436.73720: getting the next task for host managed_node3 12081 1726882436.73728: done getting next task for host managed_node3 12081 1726882436.73731: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 12081 1726882436.73736: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882436.73740: getting variables 12081 1726882436.73742: in VariableManager get_vars() 12081 1726882436.73789: Calling all_inventory to load vars for managed_node3 12081 1726882436.73793: Calling groups_inventory to load vars for managed_node3 12081 1726882436.73796: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882436.73810: Calling all_plugins_play to load vars for managed_node3 12081 1726882436.73813: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882436.73817: Calling groups_plugins_play to load vars for managed_node3 12081 1726882436.74671: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2a 12081 1726882436.74675: WORKER PROCESS EXITING 12081 1726882436.75779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882436.77593: done with get_vars() 12081 1726882436.77625: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 21:33:56 -0400 (0:00:02.033) 0:00:56.580 ****** 12081 1726882436.77753: entering _queue_task() for managed_node3/include_tasks 12081 1726882436.78277: worker is 1 (out of 1 available) 12081 1726882436.78291: exiting _queue_task() for managed_node3/include_tasks 12081 1726882436.78305: done queuing things up, now waiting for results queue to drain 12081 1726882436.78307: waiting for pending results... 
12081 1726882436.78622: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 12081 1726882436.78758: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000c2c 12081 1726882436.78784: variable 'ansible_search_path' from source: unknown 12081 1726882436.78792: variable 'ansible_search_path' from source: unknown 12081 1726882436.78832: calling self._execute() 12081 1726882436.78933: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.78943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.78954: variable 'omit' from source: magic vars 12081 1726882436.79337: variable 'ansible_distribution_major_version' from source: facts 12081 1726882436.79355: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882436.79367: _execute() done 12081 1726882436.79376: dumping result to json 12081 1726882436.79383: done dumping result, returning 12081 1726882436.79399: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000c2c] 12081 1726882436.79415: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2c 12081 1726882436.79551: no more pending results, returning what we have 12081 1726882436.79556: in VariableManager get_vars() 12081 1726882436.79609: Calling all_inventory to load vars for managed_node3 12081 1726882436.79613: Calling groups_inventory to load vars for managed_node3 12081 1726882436.79615: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882436.79633: Calling all_plugins_play to load vars for managed_node3 12081 1726882436.79637: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882436.79640: Calling groups_plugins_play to load vars for managed_node3 12081 1726882436.80728: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2c 12081 1726882436.80731: WORKER PROCESS EXITING 12081 
1726882436.81446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882436.83224: done with get_vars() 12081 1726882436.83258: variable 'ansible_search_path' from source: unknown 12081 1726882436.83260: variable 'ansible_search_path' from source: unknown 12081 1726882436.83272: variable 'item' from source: include params 12081 1726882436.83400: variable 'item' from source: include params 12081 1726882436.83437: we have included files to process 12081 1726882436.83438: generating all_blocks data 12081 1726882436.83441: done generating all_blocks data 12081 1726882436.83453: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882436.83454: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882436.83458: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12081 1726882436.83671: done processing included file 12081 1726882436.83674: iterating over new_blocks loaded from include file 12081 1726882436.83676: in VariableManager get_vars() 12081 1726882436.83700: done with get_vars() 12081 1726882436.83702: filtering new block on tags 12081 1726882436.83729: done filtering new block on tags 12081 1726882436.83732: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 12081 1726882436.83737: extending task lists for all hosts with included blocks 12081 1726882436.83983: done extending task lists 12081 1726882436.83985: done processing included files 12081 1726882436.83986: results queue empty 12081 1726882436.83986: checking for any_errors_fatal 12081 1726882436.84000: 
done checking for any_errors_fatal 12081 1726882436.84001: checking for max_fail_percentage 12081 1726882436.84002: done checking for max_fail_percentage 12081 1726882436.84003: checking to see if all hosts have failed and the running result is not ok 12081 1726882436.84004: done checking to see if all hosts have failed 12081 1726882436.84005: getting the remaining hosts for this loop 12081 1726882436.84007: done getting the remaining hosts for this loop 12081 1726882436.84010: getting the next task for host managed_node3 12081 1726882436.84014: done getting next task for host managed_node3 12081 1726882436.84016: ^ task is: TASK: ** TEST check IPv4 12081 1726882436.84019: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882436.84021: getting variables 12081 1726882436.84022: in VariableManager get_vars() 12081 1726882436.84036: Calling all_inventory to load vars for managed_node3 12081 1726882436.84039: Calling groups_inventory to load vars for managed_node3 12081 1726882436.84041: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882436.84047: Calling all_plugins_play to load vars for managed_node3 12081 1726882436.84050: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882436.84053: Calling groups_plugins_play to load vars for managed_node3 12081 1726882436.85612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882436.87327: done with get_vars() 12081 1726882436.87358: done getting variables 12081 1726882436.87410: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 21:33:56 -0400 (0:00:00.096) 0:00:56.677 ****** 12081 1726882436.87443: entering _queue_task() for managed_node3/command 12081 1726882436.87795: worker is 1 (out of 1 available) 12081 1726882436.87812: exiting _queue_task() for managed_node3/command 12081 1726882436.87826: done queuing things up, now waiting for results queue to drain 12081 1726882436.87827: waiting for pending results... 
12081 1726882436.88136: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 12081 1726882436.88270: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000da6 12081 1726882436.88292: variable 'ansible_search_path' from source: unknown 12081 1726882436.88299: variable 'ansible_search_path' from source: unknown 12081 1726882436.88338: calling self._execute() 12081 1726882436.88448: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.88458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.88482: variable 'omit' from source: magic vars 12081 1726882436.88851: variable 'ansible_distribution_major_version' from source: facts 12081 1726882436.88874: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882436.88886: variable 'omit' from source: magic vars 12081 1726882436.88954: variable 'omit' from source: magic vars 12081 1726882436.89144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882436.91906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882436.91971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882436.92001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882436.92029: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882436.92048: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882436.92121: variable 'interface' from source: include params 12081 1726882436.92124: variable 'controller_device' from source: play vars 12081 1726882436.92178: variable 'controller_device' from source: play vars 12081 1726882436.92198: variable 'omit' 
from source: magic vars 12081 1726882436.92224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882436.92244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882436.92261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882436.92276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882436.92285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882436.92310: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882436.92314: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.92316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.92387: Set connection var ansible_pipelining to False 12081 1726882436.92391: Set connection var ansible_shell_type to sh 12081 1726882436.92396: Set connection var ansible_shell_executable to /bin/sh 12081 1726882436.92399: Set connection var ansible_connection to ssh 12081 1726882436.92404: Set connection var ansible_timeout to 10 12081 1726882436.92413: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882436.92430: variable 'ansible_shell_executable' from source: unknown 12081 1726882436.92433: variable 'ansible_connection' from source: unknown 12081 1726882436.92435: variable 'ansible_module_compression' from source: unknown 12081 1726882436.92438: variable 'ansible_shell_type' from source: unknown 12081 1726882436.92440: variable 'ansible_shell_executable' from source: unknown 12081 1726882436.92442: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882436.92444: variable 'ansible_pipelining' from source: unknown 
12081 1726882436.92447: variable 'ansible_timeout' from source: unknown 12081 1726882436.92455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882436.92525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882436.92534: variable 'omit' from source: magic vars 12081 1726882436.92539: starting attempt loop 12081 1726882436.92541: running the handler 12081 1726882436.92554: _low_level_execute_command(): starting 12081 1726882436.92564: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882436.93074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.93085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.93105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882436.93118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.93128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
12081 1726882436.93175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.93188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.93198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.93306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.94963: stdout chunk (state=3): >>>/root <<< 12081 1726882436.95069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.95130: stderr chunk (state=3): >>><<< 12081 1726882436.95132: stdout chunk (state=3): >>><<< 12081 1726882436.95170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.95183: _low_level_execute_command(): starting 12081 1726882436.95186: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392 `" && echo ansible-tmp-1726882436.9514687-14647-86496724690392="` echo /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392 `" ) && sleep 0' 12081 1726882436.96320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882436.96324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.96326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.96329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.96338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.96340: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882436.96374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.96394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882436.96401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882436.96408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882436.96415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882436.96424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882436.96439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882436.96559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882436.96562: stderr chunk (state=3): >>>debug2: match found 
<<< 12081 1726882436.96581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882436.96654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882436.96663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882436.96670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882436.97181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882436.99022: stdout chunk (state=3): >>>ansible-tmp-1726882436.9514687-14647-86496724690392=/root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392 <<< 12081 1726882436.99133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882436.99187: stderr chunk (state=3): >>><<< 12081 1726882436.99189: stdout chunk (state=3): >>><<< 12081 1726882436.99270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882436.9514687-14647-86496724690392=/root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882436.99274: variable 'ansible_module_compression' from source: unknown 12081 1726882436.99277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882436.99301: variable 'ansible_facts' from source: unknown 12081 1726882436.99360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/AnsiballZ_command.py 12081 1726882436.99467: Sending initial data 12081 1726882436.99470: Sent initial data (155 bytes) 12081 1726882437.00693: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.00701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.00710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.00724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.00761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.00770: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.00780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.00793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.00800: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.00808: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.00813: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12081 1726882437.00822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.00881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.00884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.00886: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.00888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.00928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.00973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.00975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.01092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.02840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882437.02932: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882437.03034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp716w95mx 
/root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/AnsiballZ_command.py <<< 12081 1726882437.03130: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882437.04638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.04718: stderr chunk (state=3): >>><<< 12081 1726882437.04721: stdout chunk (state=3): >>><<< 12081 1726882437.04752: done transferring module to remote 12081 1726882437.04757: _low_level_execute_command(): starting 12081 1726882437.04760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/ /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/AnsiballZ_command.py && sleep 0' 12081 1726882437.05418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.05427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.05438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.05451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.05497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.05505: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.05512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.05525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.05533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.05540: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.05547: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12081 1726882437.05558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.05569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.05577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.05584: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.05597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.05668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.05683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.05693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.05817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.07573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.07644: stderr chunk (state=3): >>><<< 12081 1726882437.07647: stdout chunk (state=3): >>><<< 12081 1726882437.07655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.07660: _low_level_execute_command(): starting 12081 1726882437.07666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/AnsiballZ_command.py && sleep 0' 12081 1726882437.08688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.08693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.08708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.08713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.08766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882437.08772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.08785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882437.08790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.08797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.08801: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.08806: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.08818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.08914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.08927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.08933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.09088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.24848: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.201/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 239sec preferred_lft 239sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:57.243517", "end": "2024-09-20 21:33:57.246911", "delta": "0:00:00.003394", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882437.26093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882437.26148: stderr chunk (state=3): >>><<< 12081 1726882437.26151: stdout chunk (state=3): >>><<< 12081 1726882437.26275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.201/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 239sec preferred_lft 239sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:57.243517", "end": "2024-09-20 21:33:57.246911", "delta": "0:00:00.003394", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.9.105 closed. 12081 1726882437.26284: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882437.26286: _low_level_execute_command(): starting 12081 1726882437.26288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882436.9514687-14647-86496724690392/ > /dev/null 2>&1 && sleep 0' 12081 1726882437.26971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.26992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.27007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.27025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.27086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.27102: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.27118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.27137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 
1726882437.27158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.27174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.27186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.27205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.27222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.27235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.27249: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.27274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.27356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.27386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.27403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.27538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.29356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.29411: stderr chunk (state=3): >>><<< 12081 1726882437.29414: stdout chunk (state=3): >>><<< 12081 1726882437.29429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.29435: handler run complete 12081 1726882437.29454: Evaluated conditional (False): False 12081 1726882437.29615: variable 'address' from source: include params 12081 1726882437.29618: variable 'result' from source: set_fact 12081 1726882437.29621: Evaluated conditional (address in result.stdout): True 12081 1726882437.29628: attempt loop complete, returning result 12081 1726882437.29631: _execute() done 12081 1726882437.29633: dumping result to json 12081 1726882437.29642: done dumping result, returning 12081 1726882437.29666: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0e448fcc-3ce9-0a3f-ff3c-000000000da6] 12081 1726882437.29669: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000da6 12081 1726882437.29760: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000da6 12081 1726882437.29762: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003394", "end": "2024-09-20 21:33:57.246911", "rc": 0, "start": "2024-09-20 21:33:57.243517" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.201/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 239sec 
preferred_lft 239sec 12081 1726882437.29839: no more pending results, returning what we have 12081 1726882437.29843: results queue empty 12081 1726882437.29844: checking for any_errors_fatal 12081 1726882437.29846: done checking for any_errors_fatal 12081 1726882437.29846: checking for max_fail_percentage 12081 1726882437.29848: done checking for max_fail_percentage 12081 1726882437.29849: checking to see if all hosts have failed and the running result is not ok 12081 1726882437.29850: done checking to see if all hosts have failed 12081 1726882437.29851: getting the remaining hosts for this loop 12081 1726882437.29853: done getting the remaining hosts for this loop 12081 1726882437.29856: getting the next task for host managed_node3 12081 1726882437.29867: done getting next task for host managed_node3 12081 1726882437.29869: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 12081 1726882437.29872: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882437.29878: getting variables 12081 1726882437.29880: in VariableManager get_vars() 12081 1726882437.29921: Calling all_inventory to load vars for managed_node3 12081 1726882437.29924: Calling groups_inventory to load vars for managed_node3 12081 1726882437.29931: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882437.29942: Calling all_plugins_play to load vars for managed_node3 12081 1726882437.29944: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882437.29947: Calling groups_plugins_play to load vars for managed_node3 12081 1726882437.31347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882437.37841: done with get_vars() 12081 1726882437.37877: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 21:33:57 -0400 (0:00:00.505) 0:00:57.182 ****** 12081 1726882437.37968: entering _queue_task() for managed_node3/include_tasks 12081 1726882437.38324: worker is 1 (out of 1 available) 12081 1726882437.38337: exiting _queue_task() for managed_node3/include_tasks 12081 1726882437.38351: done queuing things up, now waiting for results queue to drain 12081 1726882437.38352: waiting for pending results... 
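The completed task above shows the full lifecycle of one `command` module run: AnsiballZ module transfer over sftp, `chmod`, remote Python execution of `ip -4 a s nm-bond`, JSON result collection, tmpdir cleanup, and finally the conditional `Evaluated conditional (address in result.stdout): True`. That check is a plain substring test against the command's stdout. A minimal local sketch of the same verification (the helper names are hypothetical; only the command, the sample output, and the substring test come from the log):

```python
import subprocess


def ipv4_address_present(stdout: str, address: str) -> bool:
    """Mimic the playbook conditional `address in result.stdout`:
    a plain substring check against `ip -4 a s <device>` output."""
    return address in stdout


def check_device(device: str, address: str) -> bool:
    """Run the same command the task ran (`ip -4 a s nm-bond`) and
    apply the substring check. Requires iproute2 on the local host."""
    out = subprocess.run(
        ["ip", "-4", "a", "s", device],
        capture_output=True, text=True, check=True,
    ).stdout
    return ipv4_address_present(out, address)


# Sample stdout copied verbatim from the task result above:
sample = (
    "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    inet 192.0.2.201/24 brd 192.0.2.255 scope global dynamic "
    "noprefixroute nm-bond\n"
    "    valid_lft 239sec preferred_lft 239sec"
)
```

Note that a bare substring match is what the test actually evaluates, so `192.0.2.20` would also match `192.0.2.201`; the test relies on the full address being distinctive enough.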
12081 1726882437.38676: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 12081 1726882437.38801: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000c2d 12081 1726882437.38823: variable 'ansible_search_path' from source: unknown 12081 1726882437.38831: variable 'ansible_search_path' from source: unknown 12081 1726882437.38879: calling self._execute() 12081 1726882437.39002: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.39018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.39034: variable 'omit' from source: magic vars 12081 1726882437.39424: variable 'ansible_distribution_major_version' from source: facts 12081 1726882437.39449: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882437.39464: _execute() done 12081 1726882437.39474: dumping result to json 12081 1726882437.39481: done dumping result, returning 12081 1726882437.39492: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [0e448fcc-3ce9-0a3f-ff3c-000000000c2d] 12081 1726882437.39504: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2d 12081 1726882437.39645: no more pending results, returning what we have 12081 1726882437.39650: in VariableManager get_vars() 12081 1726882437.39709: Calling all_inventory to load vars for managed_node3 12081 1726882437.39712: Calling groups_inventory to load vars for managed_node3 12081 1726882437.39715: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882437.39730: Calling all_plugins_play to load vars for managed_node3 12081 1726882437.39733: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882437.39736: Calling groups_plugins_play to load vars for managed_node3 12081 1726882437.40794: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000c2d 12081 1726882437.40798: WORKER PROCESS EXITING 12081 
1726882437.41537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882437.43319: done with get_vars() 12081 1726882437.43347: variable 'ansible_search_path' from source: unknown 12081 1726882437.43349: variable 'ansible_search_path' from source: unknown 12081 1726882437.43363: variable 'item' from source: include params 12081 1726882437.43481: variable 'item' from source: include params 12081 1726882437.43517: we have included files to process 12081 1726882437.43519: generating all_blocks data 12081 1726882437.43521: done generating all_blocks data 12081 1726882437.43527: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882437.43528: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882437.43530: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12081 1726882437.43729: done processing included file 12081 1726882437.43731: iterating over new_blocks loaded from include file 12081 1726882437.43733: in VariableManager get_vars() 12081 1726882437.43757: done with get_vars() 12081 1726882437.43759: filtering new block on tags 12081 1726882437.43792: done filtering new block on tags 12081 1726882437.43794: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 12081 1726882437.43800: extending task lists for all hosts with included blocks 12081 1726882437.44169: done extending task lists 12081 1726882437.44171: done processing included files 12081 1726882437.44172: results queue empty 12081 1726882437.44172: checking for any_errors_fatal 12081 1726882437.44179: 
done checking for any_errors_fatal 12081 1726882437.44180: checking for max_fail_percentage 12081 1726882437.44181: done checking for max_fail_percentage 12081 1726882437.44182: checking to see if all hosts have failed and the running result is not ok 12081 1726882437.44183: done checking to see if all hosts have failed 12081 1726882437.44183: getting the remaining hosts for this loop 12081 1726882437.44185: done getting the remaining hosts for this loop 12081 1726882437.44188: getting the next task for host managed_node3 12081 1726882437.44192: done getting next task for host managed_node3 12081 1726882437.44195: ^ task is: TASK: ** TEST check IPv6 12081 1726882437.44198: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882437.44201: getting variables 12081 1726882437.44202: in VariableManager get_vars() 12081 1726882437.44218: Calling all_inventory to load vars for managed_node3 12081 1726882437.44221: Calling groups_inventory to load vars for managed_node3 12081 1726882437.44223: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882437.44229: Calling all_plugins_play to load vars for managed_node3 12081 1726882437.44231: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882437.44234: Calling groups_plugins_play to load vars for managed_node3 12081 1726882437.45617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882437.47346: done with get_vars() 12081 1726882437.47381: done getting variables 12081 1726882437.47430: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 21:33:57 -0400 (0:00:00.094) 0:00:57.277 ****** 12081 1726882437.47469: entering _queue_task() for managed_node3/command 12081 1726882437.47821: worker is 1 (out of 1 available) 12081 1726882437.47832: exiting _queue_task() for managed_node3/command 12081 1726882437.47844: done queuing things up, now waiting for results queue to drain 12081 1726882437.47845: waiting for pending results... 
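The scheduler has now queued the include (task path `assert_bond_options.yml:16`) and the IPv6 check (task path `assert_IPv6_present.yml:3`). The playbook sources themselves are not shown in this log; the following YAML is a hypothetical reconstruction based only on the task names, file paths, and the conditionals the log reports evaluating (`ansible_distribution_major_version != '6'`, the `item` include param, the `controller_device` play var). The exact arguments are assumptions:

```yaml
# Hypothetical sketch -- the real files live in the fedora.linux_system_roles
# test collection and are not reproduced in this log.

# tasks/assert_bond_options.yml, around line 16:
- name: Include the task 'assert_IPv6_present.yml'
  include_tasks: "tasks/{{ item }}"          # 'item' arrives as an include param
  when: ansible_distribution_major_version != '6'

# tasks/assert_IPv6_present.yml, around line 3 -- presumably the IPv6
# counterpart of the 'ip -4 a s nm-bond' command run earlier:
- name: "** TEST check IPv6"
  command: "ip -6 a s {{ controller_device }}"
```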
12081 1726882437.48136: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 12081 1726882437.48266: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000dc7 12081 1726882437.48291: variable 'ansible_search_path' from source: unknown 12081 1726882437.48299: variable 'ansible_search_path' from source: unknown 12081 1726882437.48341: calling self._execute() 12081 1726882437.48458: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.48472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.48485: variable 'omit' from source: magic vars 12081 1726882437.48868: variable 'ansible_distribution_major_version' from source: facts 12081 1726882437.48886: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882437.48897: variable 'omit' from source: magic vars 12081 1726882437.48958: variable 'omit' from source: magic vars 12081 1726882437.49128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882437.51597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882437.51679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882437.51732: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882437.51779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882437.51811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882437.51905: variable 'controller_device' from source: play vars 12081 1726882437.51935: variable 'omit' from source: magic vars 12081 1726882437.51976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882437.52011: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882437.52036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882437.52059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882437.52078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882437.52119: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882437.52127: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.52135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.52247: Set connection var ansible_pipelining to False 12081 1726882437.52258: Set connection var ansible_shell_type to sh 12081 1726882437.52275: Set connection var ansible_shell_executable to /bin/sh 12081 1726882437.52282: Set connection var ansible_connection to ssh 12081 1726882437.52292: Set connection var ansible_timeout to 10 12081 1726882437.52301: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882437.52339: variable 'ansible_shell_executable' from source: unknown 12081 1726882437.52347: variable 'ansible_connection' from source: unknown 12081 1726882437.52356: variable 'ansible_module_compression' from source: unknown 12081 1726882437.52365: variable 'ansible_shell_type' from source: unknown 12081 1726882437.52372: variable 'ansible_shell_executable' from source: unknown 12081 1726882437.52379: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.52386: variable 'ansible_pipelining' from source: unknown 12081 1726882437.52392: variable 'ansible_timeout' from source: unknown 12081 1726882437.52399: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882437.52513: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882437.52534: variable 'omit' from source: magic vars 12081 1726882437.52545: starting attempt loop 12081 1726882437.52552: running the handler 12081 1726882437.52578: _low_level_execute_command(): starting 12081 1726882437.52588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882437.53345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.53362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.53379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.53396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.53441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.53456: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.53474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.53492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.53505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.53522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.53535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.53549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882437.53572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.53585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.53598: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.53612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.53695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.53718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.53739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.53880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.55599: stdout chunk (state=3): >>>/root <<< 12081 1726882437.55695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.55791: stderr chunk (state=3): >>><<< 12081 1726882437.55804: stdout chunk (state=3): >>><<< 12081 1726882437.55933: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.55945: _low_level_execute_command(): starting 12081 1726882437.55948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232 `" && echo ansible-tmp-1726882437.5583835-14677-223705098795232="` echo /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232 `" ) && sleep 0' 12081 1726882437.56539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.56552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.56569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.56594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.56640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.56654: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.56673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.56696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.56713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.56739: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.56742: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.56745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882437.56747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.56823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.56850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.56990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.58889: stdout chunk (state=3): >>>ansible-tmp-1726882437.5583835-14677-223705098795232=/root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232 <<< 12081 1726882437.58991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.59054: stderr chunk (state=3): >>><<< 12081 1726882437.59056: stdout chunk (state=3): >>><<< 12081 1726882437.59081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882437.5583835-14677-223705098795232=/root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.59101: variable 'ansible_module_compression' from source: unknown 12081 1726882437.59145: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882437.59170: variable 'ansible_facts' from source: unknown 12081 1726882437.59233: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/AnsiballZ_command.py 12081 1726882437.59342: Sending initial data 12081 1726882437.59346: Sent initial data (156 bytes) 12081 1726882437.60278: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.60281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.60284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.60287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.60289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.60291: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.60294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.60300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.60303: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882437.60305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.60307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.60310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.60477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.60480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.60482: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.60485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.60487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.60489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.60491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.60569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.62310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882437.62403: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server 
upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882437.62505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp59oo12g6 /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/AnsiballZ_command.py <<< 12081 1726882437.62597: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882437.63632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.63769: stderr chunk (state=3): >>><<< 12081 1726882437.63773: stdout chunk (state=3): >>><<< 12081 1726882437.63775: done transferring module to remote 12081 1726882437.63778: _low_level_execute_command(): starting 12081 1726882437.63780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/ /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/AnsiballZ_command.py && sleep 0' 12081 1726882437.64372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882437.64378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.64391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.64408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.64447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.64462: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882437.64474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.64488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882437.64496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
<<< 12081 1726882437.64508: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882437.64515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.64524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.64537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.64544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882437.64551: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882437.64565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.64644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.64669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.64681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.64813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.66796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.66851: stderr chunk (state=3): >>><<< 12081 1726882437.66857: stdout chunk (state=3): >>><<< 12081 1726882437.66870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.66873: _low_level_execute_command(): starting 12081 1726882437.66880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/AnsiballZ_command.py && sleep 0' 12081 1726882437.67349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.67353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.67388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882437.67393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.67396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.67449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.67453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882437.67455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.67570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.81163: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a2/128 scope global tentative dynamic noprefixroute \n valid_lft 240sec preferred_lft 240sec\n inet6 2001:db8::8f67:26d6:a5d0:40f6/64 scope global dynamic noprefixroute \n valid_lft 1800sec preferred_lft 1800sec\n inet6 fe80::93d3:44dd:dcdc:b1a8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:57.806994", "end": "2024-09-20 21:33:57.810172", "delta": "0:00:00.003178", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882437.82240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882437.82301: stderr chunk (state=3): >>><<< 12081 1726882437.82304: stdout chunk (state=3): >>><<< 12081 1726882437.82319: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::a2/128 scope global tentative dynamic noprefixroute \n valid_lft 240sec preferred_lft 240sec\n inet6 2001:db8::8f67:26d6:a5d0:40f6/64 scope global dynamic noprefixroute \n valid_lft 1800sec preferred_lft 1800sec\n inet6 fe80::93d3:44dd:dcdc:b1a8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:57.806994", "end": "2024-09-20 21:33:57.810172", "delta": "0:00:00.003178", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882437.82353: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882437.82361: _low_level_execute_command(): starting 12081 1726882437.82368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882437.5583835-14677-223705098795232/ > /dev/null 2>&1 && sleep 0' 12081 1726882437.82835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882437.82839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882437.82877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882437.82880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.82894: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 12081 1726882437.82899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882437.82911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882437.82920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882437.82970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882437.82982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882437.83090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882437.84877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882437.84928: stderr chunk (state=3): >>><<< 12081 1726882437.84931: stdout chunk (state=3): >>><<< 12081 1726882437.84944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882437.84951: handler run complete 12081 1726882437.84975: Evaluated conditional (False): False 12081 1726882437.85089: variable 'address' from source: include params 12081 1726882437.85092: variable 'result' from source: set_fact 12081 1726882437.85109: Evaluated conditional (address in result.stdout): True 12081 1726882437.85117: attempt loop complete, returning result 12081 1726882437.85120: _execute() done 12081 1726882437.85122: dumping result to json 12081 1726882437.85127: done dumping result, returning 12081 1726882437.85135: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0e448fcc-3ce9-0a3f-ff3c-000000000dc7] 12081 1726882437.85140: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000dc7 12081 1726882437.85244: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000dc7 12081 1726882437.85246: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003178", "end": "2024-09-20 21:33:57.810172", "rc": 0, "start": "2024-09-20 21:33:57.806994" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::a2/128 scope global tentative dynamic noprefixroute valid_lft 240sec preferred_lft 240sec inet6 2001:db8::8f67:26d6:a5d0:40f6/64 scope global dynamic noprefixroute valid_lft 1800sec preferred_lft 1800sec inet6 fe80::93d3:44dd:dcdc:b1a8/64 scope link noprefixroute valid_lft forever preferred_lft forever 12081 1726882437.85327: no more pending results, 
returning what we have 12081 1726882437.85331: results queue empty 12081 1726882437.85332: checking for any_errors_fatal 12081 1726882437.85333: done checking for any_errors_fatal 12081 1726882437.85334: checking for max_fail_percentage 12081 1726882437.85335: done checking for max_fail_percentage 12081 1726882437.85336: checking to see if all hosts have failed and the running result is not ok 12081 1726882437.85337: done checking to see if all hosts have failed 12081 1726882437.85338: getting the remaining hosts for this loop 12081 1726882437.85340: done getting the remaining hosts for this loop 12081 1726882437.85344: getting the next task for host managed_node3 12081 1726882437.85352: done getting next task for host managed_node3 12081 1726882437.85357: ^ task is: TASK: Conditional asserts 12081 1726882437.85359: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882437.85364: getting variables 12081 1726882437.85370: in VariableManager get_vars() 12081 1726882437.85410: Calling all_inventory to load vars for managed_node3 12081 1726882437.85413: Calling groups_inventory to load vars for managed_node3 12081 1726882437.85415: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882437.85426: Calling all_plugins_play to load vars for managed_node3 12081 1726882437.85428: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882437.85430: Calling groups_plugins_play to load vars for managed_node3 12081 1726882437.86273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882437.87302: done with get_vars() 12081 1726882437.87319: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:33:57 -0400 (0:00:00.399) 0:00:57.676 ****** 12081 1726882437.87394: entering _queue_task() for managed_node3/include_tasks 12081 1726882437.87627: worker is 1 (out of 1 available) 12081 1726882437.87640: exiting _queue_task() for managed_node3/include_tasks 12081 1726882437.87653: done queuing things up, now waiting for results queue to drain 12081 1726882437.87654: waiting for pending results... 
12081 1726882437.87841: running TaskExecutor() for managed_node3/TASK: Conditional asserts 12081 1726882437.87914: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008f0 12081 1726882437.87925: variable 'ansible_search_path' from source: unknown 12081 1726882437.87928: variable 'ansible_search_path' from source: unknown 12081 1726882437.88146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882437.89752: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882437.89801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882437.89826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882437.89865: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882437.89886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882437.89960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882437.89983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882437.90000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882437.90026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 12081 1726882437.90037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882437.90165: dumping result to json 12081 1726882437.90169: done dumping result, returning 12081 1726882437.90178: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0e448fcc-3ce9-0a3f-ff3c-0000000008f0] 12081 1726882437.90185: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f0 12081 1726882437.90280: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f0 12081 1726882437.90282: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 12081 1726882437.90327: no more pending results, returning what we have 12081 1726882437.90331: results queue empty 12081 1726882437.90331: checking for any_errors_fatal 12081 1726882437.90342: done checking for any_errors_fatal 12081 1726882437.90343: checking for max_fail_percentage 12081 1726882437.90345: done checking for max_fail_percentage 12081 1726882437.90346: checking to see if all hosts have failed and the running result is not ok 12081 1726882437.90347: done checking to see if all hosts have failed 12081 1726882437.90347: getting the remaining hosts for this loop 12081 1726882437.90349: done getting the remaining hosts for this loop 12081 1726882437.90355: getting the next task for host managed_node3 12081 1726882437.90361: done getting next task for host managed_node3 12081 1726882437.90366: ^ task is: TASK: Success in test '{{ lsr_description }}' 12081 1726882437.90369: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882437.90374: getting variables
12081 1726882437.90376: in VariableManager get_vars()
12081 1726882437.90418: Calling all_inventory to load vars for managed_node3
12081 1726882437.90421: Calling groups_inventory to load vars for managed_node3
12081 1726882437.90424: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882437.90434: Calling all_plugins_play to load vars for managed_node3
12081 1726882437.90436: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882437.90439: Calling groups_plugins_play to load vars for managed_node3
12081 1726882437.91294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882437.92252: done with get_vars()
12081 1726882437.92273: done getting variables
12081 1726882437.92318: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
12081 1726882437.92408: variable 'lsr_description' from source: include params

TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47
Friday 20 September 2024 21:33:57 -0400 (0:00:00.050) 0:00:57.727 ******
12081 1726882437.92433: entering _queue_task() for managed_node3/debug
12081 1726882437.92674: worker is 1 (out of 1 available)
12081 1726882437.92689: exiting _queue_task() for managed_node3/debug
12081 1726882437.92701: done queuing things up, now waiting for results queue to drain
12081 1726882437.92702: waiting for pending results...
12081 1726882437.92898: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'
12081 1726882437.92972: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008f1
12081 1726882437.92982: variable 'ansible_search_path' from source: unknown
12081 1726882437.92985: variable 'ansible_search_path' from source: unknown
12081 1726882437.93017: calling self._execute()
12081 1726882437.93130: variable 'ansible_host' from source: host vars for 'managed_node3'
12081 1726882437.93141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12081 1726882437.93154: variable 'omit' from source: magic vars
12081 1726882437.93511: variable 'ansible_distribution_major_version' from source: facts
12081 1726882437.93530: Evaluated conditional (ansible_distribution_major_version != '6'): True
12081 1726882437.93542: variable 'omit' from source: magic vars
12081 1726882437.93590: variable 'omit' from source: magic vars
12081 1726882437.93695: variable 'lsr_description' from source: include params
12081 1726882437.93719: variable 'omit' from source: magic vars
12081 1726882437.93768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12081 1726882437.93806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12081 1726882437.93830: trying
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882437.93850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882437.93868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882437.93904: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882437.93912: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.93920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.94021: Set connection var ansible_pipelining to False 12081 1726882437.94029: Set connection var ansible_shell_type to sh 12081 1726882437.94040: Set connection var ansible_shell_executable to /bin/sh 12081 1726882437.94046: Set connection var ansible_connection to ssh 12081 1726882437.94056: Set connection var ansible_timeout to 10 12081 1726882437.94067: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882437.94098: variable 'ansible_shell_executable' from source: unknown 12081 1726882437.94106: variable 'ansible_connection' from source: unknown 12081 1726882437.94114: variable 'ansible_module_compression' from source: unknown 12081 1726882437.94122: variable 'ansible_shell_type' from source: unknown 12081 1726882437.94129: variable 'ansible_shell_executable' from source: unknown 12081 1726882437.94136: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.94144: variable 'ansible_pipelining' from source: unknown 12081 1726882437.94151: variable 'ansible_timeout' from source: unknown 12081 1726882437.94160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.94302: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
12081 1726882437.94320: variable 'omit' from source: magic vars
12081 1726882437.94330: starting attempt loop
12081 1726882437.94337: running the handler
12081 1726882437.94390: handler run complete
12081 1726882437.94411: attempt loop complete, returning result
12081 1726882437.94418: _execute() done
12081 1726882437.94423: dumping result to json
12081 1726882437.94429: done dumping result, returning
12081 1726882437.94440: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0e448fcc-3ce9-0a3f-ff3c-0000000008f1]
12081 1726882437.94451: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f1
12081 1726882437.94580: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f1
12081 1726882437.94583: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

+++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++
12081 1726882437.94651: no more pending results, returning what we have
12081 1726882437.94654: results queue empty
12081 1726882437.94655: checking for any_errors_fatal
12081 1726882437.94665: done checking for any_errors_fatal
12081 1726882437.94665: checking for max_fail_percentage
12081 1726882437.94667: done checking for max_fail_percentage
12081 1726882437.94668: checking to see if all hosts have failed and the running result is not ok
12081 1726882437.94669: done checking to see if all hosts have failed
12081 1726882437.94670: getting the remaining hosts for this loop
12081 1726882437.94671: done getting the remaining hosts for this loop
12081 1726882437.94675: getting the next task for host managed_node3
12081 1726882437.94682: done getting next task for host managed_node3
12081 1726882437.94684: ^ task is: TASK: Cleanup
12081 1726882437.94688: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12081 1726882437.94693: getting variables
12081 1726882437.94694: in VariableManager get_vars()
12081 1726882437.94736: Calling all_inventory to load vars for managed_node3
12081 1726882437.94739: Calling groups_inventory to load vars for managed_node3
12081 1726882437.94741: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882437.94753: Calling all_plugins_play to load vars for managed_node3
12081 1726882437.94757: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882437.94760: Calling groups_plugins_play to load vars for managed_node3
12081 1726882437.95766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882437.96852: done with get_vars()
12081 1726882437.96876: done getting variables

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66
Friday 20 September 2024 21:33:57 -0400 (0:00:00.045) 0:00:57.772 ******
12081 1726882437.96960: entering _queue_task() for managed_node3/include_tasks
12081 1726882437.97278: worker is 1 (out of 1 available)
12081 1726882437.97291: exiting _queue_task() for managed_node3/include_tasks
12081 1726882437.97303: done queuing things up, now waiting for results queue to drain
12081 1726882437.97305: waiting for pending results...
12081 1726882437.97591: running TaskExecutor() for managed_node3/TASK: Cleanup 12081 1726882437.97698: in run() - task 0e448fcc-3ce9-0a3f-ff3c-0000000008f5 12081 1726882437.97715: variable 'ansible_search_path' from source: unknown 12081 1726882437.97721: variable 'ansible_search_path' from source: unknown 12081 1726882437.97774: variable 'lsr_cleanup' from source: include params 12081 1726882437.97991: variable 'lsr_cleanup' from source: include params 12081 1726882437.98074: variable 'omit' from source: magic vars 12081 1726882437.98213: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.98226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.98241: variable 'omit' from source: magic vars 12081 1726882437.98489: variable 'ansible_distribution_major_version' from source: facts 12081 1726882437.98508: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882437.98518: variable 'item' from source: unknown 12081 1726882437.98586: variable 'item' from source: unknown 12081 1726882437.98629: variable 'item' from source: unknown 12081 1726882437.98692: variable 'item' from source: unknown 12081 1726882437.98888: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882437.98901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.98913: variable 'omit' from source: magic vars 12081 1726882437.99105: variable 'ansible_distribution_major_version' from source: facts 12081 1726882437.99117: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882437.99125: variable 'item' from source: unknown 12081 1726882437.99193: variable 'item' from source: unknown 12081 1726882437.99225: variable 'item' from source: unknown 12081 1726882437.99292: variable 'item' from source: unknown 12081 1726882437.99420: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 
1726882437.99433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882437.99446: variable 'omit' from source: magic vars 12081 1726882437.99600: variable 'ansible_distribution_major_version' from source: facts 12081 1726882437.99615: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882437.99623: variable 'item' from source: unknown 12081 1726882437.99687: variable 'item' from source: unknown 12081 1726882437.99726: variable 'item' from source: unknown 12081 1726882437.99789: variable 'item' from source: unknown 12081 1726882437.99872: dumping result to json 12081 1726882437.99881: done dumping result, returning 12081 1726882437.99890: done running TaskExecutor() for managed_node3/TASK: Cleanup [0e448fcc-3ce9-0a3f-ff3c-0000000008f5] 12081 1726882437.99904: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f5 12081 1726882437.99993: no more pending results, returning what we have 12081 1726882437.99999: in VariableManager get_vars() 12081 1726882438.00051: Calling all_inventory to load vars for managed_node3 12081 1726882438.00053: Calling groups_inventory to load vars for managed_node3 12081 1726882438.00056: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882438.00074: Calling all_plugins_play to load vars for managed_node3 12081 1726882438.00077: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882438.00080: Calling groups_plugins_play to load vars for managed_node3 12081 1726882438.01293: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-0000000008f5 12081 1726882438.01297: WORKER PROCESS EXITING 12081 1726882438.01846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882438.04443: done with get_vars() 12081 1726882438.04477: variable 'ansible_search_path' from source: unknown 12081 1726882438.04479: variable 'ansible_search_path' from source: unknown 12081 
1726882438.04523: variable 'ansible_search_path' from source: unknown 12081 1726882438.04525: variable 'ansible_search_path' from source: unknown 12081 1726882438.04559: variable 'ansible_search_path' from source: unknown 12081 1726882438.04560: variable 'ansible_search_path' from source: unknown 12081 1726882438.04591: we have included files to process 12081 1726882438.04592: generating all_blocks data 12081 1726882438.04594: done generating all_blocks data 12081 1726882438.04598: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882438.04599: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882438.04602: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12081 1726882438.04762: in VariableManager get_vars() 12081 1726882438.04791: done with get_vars() 12081 1726882438.04796: variable 'omit' from source: magic vars 12081 1726882438.04834: variable 'omit' from source: magic vars 12081 1726882438.04888: in VariableManager get_vars() 12081 1726882438.04906: done with get_vars() 12081 1726882438.04932: in VariableManager get_vars() 12081 1726882438.04952: done with get_vars() 12081 1726882438.04989: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12081 1726882438.05113: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12081 1726882438.05266: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12081 1726882438.05657: in VariableManager get_vars() 12081 1726882438.05683: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 
12081 1726882438.07792: done processing included file 12081 1726882438.07794: iterating over new_blocks loaded from include file 12081 1726882438.07795: in VariableManager get_vars() 12081 1726882438.07820: done with get_vars() 12081 1726882438.07822: filtering new block on tags 12081 1726882438.08242: done filtering new block on tags 12081 1726882438.08246: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 12081 1726882438.08252: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882438.08253: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882438.08257: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12081 1726882438.08616: done processing included file 12081 1726882438.08618: iterating over new_blocks loaded from include file 12081 1726882438.08620: in VariableManager get_vars() 12081 1726882438.08643: done with get_vars() 12081 1726882438.08644: filtering new block on tags 12081 1726882438.08677: done filtering new block on tags 12081 1726882438.08679: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 12081 1726882438.08703: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12081 1726882438.08710: loading 
included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12081 1726882438.08713: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12081 1726882438.09050: done processing included file 12081 1726882438.09052: iterating over new_blocks loaded from include file 12081 1726882438.09053: in VariableManager get_vars() 12081 1726882438.09077: done with get_vars() 12081 1726882438.09079: filtering new block on tags 12081 1726882438.09107: done filtering new block on tags 12081 1726882438.09110: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 => (item=tasks/check_network_dns.yml) 12081 1726882438.09113: extending task lists for all hosts with included blocks 12081 1726882438.12405: done extending task lists 12081 1726882438.12407: done processing included files 12081 1726882438.12408: results queue empty 12081 1726882438.12409: checking for any_errors_fatal 12081 1726882438.12414: done checking for any_errors_fatal 12081 1726882438.12414: checking for max_fail_percentage 12081 1726882438.12416: done checking for max_fail_percentage 12081 1726882438.12417: checking to see if all hosts have failed and the running result is not ok 12081 1726882438.12418: done checking to see if all hosts have failed 12081 1726882438.12418: getting the remaining hosts for this loop 12081 1726882438.12420: done getting the remaining hosts for this loop 12081 1726882438.12423: getting the next task for host managed_node3 12081 1726882438.12428: done getting next task for host managed_node3 12081 1726882438.12430: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882438.12434: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12081 1726882438.12446: getting variables
12081 1726882438.12448: in VariableManager get_vars()
12081 1726882438.12472: Calling all_inventory to load vars for managed_node3
12081 1726882438.12474: Calling groups_inventory to load vars for managed_node3
12081 1726882438.12477: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882438.12482: Calling all_plugins_play to load vars for managed_node3
12081 1726882438.12484: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882438.12487: Calling groups_plugins_play to load vars for managed_node3
12081 1726882438.13742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882438.15475: done with get_vars()
12081 1726882438.15504: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:33:58 -0400 (0:00:00.186) 0:00:57.958 ******
12081 1726882438.15592: entering _queue_task() for managed_node3/include_tasks
12081 1726882438.15941: worker is 1 (out of 1 available)
12081 1726882438.15953: exiting _queue_task() for managed_node3/include_tasks
12081 1726882438.15967: done queuing things up, now waiting for results queue to drain
12081 1726882438.15968: waiting for pending results...
12081 1726882438.16290: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12081 1726882438.16459: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0a 12081 1726882438.16484: variable 'ansible_search_path' from source: unknown 12081 1726882438.16492: variable 'ansible_search_path' from source: unknown 12081 1726882438.16537: calling self._execute() 12081 1726882438.16653: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.16669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.16686: variable 'omit' from source: magic vars 12081 1726882438.17079: variable 'ansible_distribution_major_version' from source: facts 12081 1726882438.17099: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882438.17110: _execute() done 12081 1726882438.17117: dumping result to json 12081 1726882438.17125: done dumping result, returning 12081 1726882438.17137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-0a3f-ff3c-000000000e0a] 12081 1726882438.17150: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0a 12081 1726882438.17265: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0a 12081 1726882438.17273: WORKER PROCESS EXITING 12081 1726882438.17329: no more pending results, returning what we have 12081 1726882438.17335: in VariableManager get_vars() 12081 1726882438.17398: Calling all_inventory to load vars for managed_node3 12081 1726882438.17401: Calling groups_inventory to load vars for managed_node3 12081 1726882438.17404: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882438.17420: Calling all_plugins_play to load vars for managed_node3 12081 1726882438.17423: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882438.17426: Calling 
groups_plugins_play to load vars for managed_node3 12081 1726882438.19293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882438.20998: done with get_vars() 12081 1726882438.21025: variable 'ansible_search_path' from source: unknown 12081 1726882438.21027: variable 'ansible_search_path' from source: unknown 12081 1726882438.21073: we have included files to process 12081 1726882438.21074: generating all_blocks data 12081 1726882438.21075: done generating all_blocks data 12081 1726882438.21077: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882438.21079: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882438.21080: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12081 1726882438.21651: done processing included file 12081 1726882438.21653: iterating over new_blocks loaded from include file 12081 1726882438.21654: in VariableManager get_vars() 12081 1726882438.21689: done with get_vars() 12081 1726882438.21691: filtering new block on tags 12081 1726882438.21723: done filtering new block on tags 12081 1726882438.21726: in VariableManager get_vars() 12081 1726882438.21753: done with get_vars() 12081 1726882438.21755: filtering new block on tags 12081 1726882438.21805: done filtering new block on tags 12081 1726882438.21807: in VariableManager get_vars() 12081 1726882438.21835: done with get_vars() 12081 1726882438.21837: filtering new block on tags 12081 1726882438.21883: done filtering new block on tags 12081 1726882438.21885: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12081 1726882438.21890: extending task lists for 
all hosts with included blocks 12081 1726882438.23557: done extending task lists 12081 1726882438.23559: done processing included files 12081 1726882438.23560: results queue empty 12081 1726882438.23560: checking for any_errors_fatal 12081 1726882438.23567: done checking for any_errors_fatal 12081 1726882438.23568: checking for max_fail_percentage 12081 1726882438.23569: done checking for max_fail_percentage 12081 1726882438.23570: checking to see if all hosts have failed and the running result is not ok 12081 1726882438.23571: done checking to see if all hosts have failed 12081 1726882438.23572: getting the remaining hosts for this loop 12081 1726882438.23573: done getting the remaining hosts for this loop 12081 1726882438.23576: getting the next task for host managed_node3 12081 1726882438.23581: done getting next task for host managed_node3 12081 1726882438.23584: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882438.23588: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882438.23601: getting variables
12081 1726882438.23602: in VariableManager get_vars()
12081 1726882438.23624: Calling all_inventory to load vars for managed_node3
12081 1726882438.23626: Calling groups_inventory to load vars for managed_node3
12081 1726882438.23629: Calling all_plugins_inventory to load vars for managed_node3
12081 1726882438.23634: Calling all_plugins_play to load vars for managed_node3
12081 1726882438.23637: Calling groups_plugins_inventory to load vars for managed_node3
12081 1726882438.23640: Calling groups_plugins_play to load vars for managed_node3
12081 1726882438.24890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882438.26622: done with get_vars()
12081 1726882438.26651: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:33:58 -0400 (0:00:00.111) 0:00:58.070 ******
12081 1726882438.26736: entering _queue_task() for managed_node3/setup
12081 1726882438.27084: worker is 1 (out of 1 available)
12081 1726882438.27096: exiting _queue_task() for managed_node3/setup
12081 1726882438.27109: done queuing things up, now waiting for results queue to drain
12081 1726882438.27111: waiting for pending results...
12081 1726882438.27414: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12081 1726882438.27597: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000fde 12081 1726882438.27616: variable 'ansible_search_path' from source: unknown 12081 1726882438.27623: variable 'ansible_search_path' from source: unknown 12081 1726882438.27672: calling self._execute() 12081 1726882438.27766: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.27780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.27794: variable 'omit' from source: magic vars 12081 1726882438.28154: variable 'ansible_distribution_major_version' from source: facts 12081 1726882438.28175: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882438.28392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882438.30778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882438.30854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882438.30901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882438.30944: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882438.30978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882438.31066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882438.31100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882438.31135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882438.31183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882438.31201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882438.31260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882438.31292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882438.31322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882438.31371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882438.31392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882438.31571: variable '__network_required_facts' from source: role 
'' defaults 12081 1726882438.31587: variable 'ansible_facts' from source: unknown 12081 1726882438.32172: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12081 1726882438.32176: when evaluation is False, skipping this task 12081 1726882438.32179: _execute() done 12081 1726882438.32182: dumping result to json 12081 1726882438.32184: done dumping result, returning 12081 1726882438.32190: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-0a3f-ff3c-000000000fde] 12081 1726882438.32196: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fde 12081 1726882438.32287: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fde 12081 1726882438.32290: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882438.32360: no more pending results, returning what we have 12081 1726882438.32367: results queue empty 12081 1726882438.32368: checking for any_errors_fatal 12081 1726882438.32369: done checking for any_errors_fatal 12081 1726882438.32370: checking for max_fail_percentage 12081 1726882438.32372: done checking for max_fail_percentage 12081 1726882438.32373: checking to see if all hosts have failed and the running result is not ok 12081 1726882438.32374: done checking to see if all hosts have failed 12081 1726882438.32374: getting the remaining hosts for this loop 12081 1726882438.32377: done getting the remaining hosts for this loop 12081 1726882438.32381: getting the next task for host managed_node3 12081 1726882438.32392: done getting next task for host managed_node3 12081 1726882438.32396: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882438.32404: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882438.32423: getting variables 12081 1726882438.32424: in VariableManager get_vars() 12081 1726882438.32469: Calling all_inventory to load vars for managed_node3 12081 1726882438.32472: Calling groups_inventory to load vars for managed_node3 12081 1726882438.32474: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882438.32483: Calling all_plugins_play to load vars for managed_node3 12081 1726882438.32486: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882438.32493: Calling groups_plugins_play to load vars for managed_node3 12081 1726882438.33324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882438.34915: done with get_vars() 12081 1726882438.34947: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:58 -0400 (0:00:00.082) 0:00:58.153 ****** 12081 1726882438.35024: entering _queue_task() for managed_node3/stat 12081 1726882438.35276: worker is 1 (out of 1 available) 12081 1726882438.35290: exiting _queue_task() for managed_node3/stat 12081 1726882438.35303: done queuing things up, now waiting for results queue to drain 12081 1726882438.35305: waiting for pending results... 
12081 1726882438.35506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12081 1726882438.35616: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000fe0 12081 1726882438.35627: variable 'ansible_search_path' from source: unknown 12081 1726882438.35631: variable 'ansible_search_path' from source: unknown 12081 1726882438.35662: calling self._execute() 12081 1726882438.35733: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.35737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.35745: variable 'omit' from source: magic vars 12081 1726882438.36011: variable 'ansible_distribution_major_version' from source: facts 12081 1726882438.36023: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882438.36142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882438.36337: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882438.36372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882438.36397: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882438.36423: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882438.36489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882438.36506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882438.36525: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882438.36543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882438.36610: variable '__network_is_ostree' from source: set_fact 12081 1726882438.36614: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882438.36618: when evaluation is False, skipping this task 12081 1726882438.36621: _execute() done 12081 1726882438.36623: dumping result to json 12081 1726882438.36626: done dumping result, returning 12081 1726882438.36633: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000fe0] 12081 1726882438.36638: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe0 12081 1726882438.36722: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe0 12081 1726882438.36725: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882438.36816: no more pending results, returning what we have 12081 1726882438.36821: results queue empty 12081 1726882438.36822: checking for any_errors_fatal 12081 1726882438.36830: done checking for any_errors_fatal 12081 1726882438.36831: checking for max_fail_percentage 12081 1726882438.36832: done checking for max_fail_percentage 12081 1726882438.36840: checking to see if all hosts have failed and the running result is not ok 12081 1726882438.36841: done checking to see if all hosts have failed 12081 1726882438.36845: getting the remaining hosts for this loop 12081 1726882438.36951: done getting the remaining hosts for this loop 12081 
1726882438.36956: getting the next task for host managed_node3 12081 1726882438.36965: done getting next task for host managed_node3 12081 1726882438.36969: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882438.36975: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882438.36995: getting variables 12081 1726882438.36997: in VariableManager get_vars() 12081 1726882438.37038: Calling all_inventory to load vars for managed_node3 12081 1726882438.37041: Calling groups_inventory to load vars for managed_node3 12081 1726882438.37044: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882438.37052: Calling all_plugins_play to load vars for managed_node3 12081 1726882438.37055: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882438.37058: Calling groups_plugins_play to load vars for managed_node3 12081 1726882438.38605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882438.40308: done with get_vars() 12081 1726882438.40334: done getting variables 12081 1726882438.40400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:58 -0400 (0:00:00.054) 0:00:58.207 ****** 12081 1726882438.40440: entering _queue_task() for managed_node3/set_fact 12081 1726882438.40775: worker is 1 (out of 1 available) 12081 1726882438.40785: exiting _queue_task() for managed_node3/set_fact 12081 1726882438.40797: done queuing things up, now waiting for results queue to drain 12081 1726882438.40799: waiting for pending results... 
12081 1726882438.41104: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12081 1726882438.41270: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000fe1 12081 1726882438.41286: variable 'ansible_search_path' from source: unknown 12081 1726882438.41290: variable 'ansible_search_path' from source: unknown 12081 1726882438.41325: calling self._execute() 12081 1726882438.41414: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.41419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.41429: variable 'omit' from source: magic vars 12081 1726882438.41782: variable 'ansible_distribution_major_version' from source: facts 12081 1726882438.41800: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882438.41962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882438.42243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882438.42289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882438.42322: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882438.42359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882438.42445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882438.42470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882438.42498: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882438.42521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882438.42615: variable '__network_is_ostree' from source: set_fact 12081 1726882438.42621: Evaluated conditional (not __network_is_ostree is defined): False 12081 1726882438.42624: when evaluation is False, skipping this task 12081 1726882438.42627: _execute() done 12081 1726882438.42629: dumping result to json 12081 1726882438.42632: done dumping result, returning 12081 1726882438.42642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-0a3f-ff3c-000000000fe1] 12081 1726882438.42648: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe1 12081 1726882438.42750: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe1 12081 1726882438.42756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12081 1726882438.42811: no more pending results, returning what we have 12081 1726882438.42816: results queue empty 12081 1726882438.42817: checking for any_errors_fatal 12081 1726882438.42827: done checking for any_errors_fatal 12081 1726882438.42828: checking for max_fail_percentage 12081 1726882438.42829: done checking for max_fail_percentage 12081 1726882438.42831: checking to see if all hosts have failed and the running result is not ok 12081 1726882438.42832: done checking to see if all hosts have failed 12081 1726882438.42833: getting the remaining hosts for this loop 12081 1726882438.42835: done getting the remaining hosts for this loop 
12081 1726882438.42839: getting the next task for host managed_node3 12081 1726882438.42851: done getting next task for host managed_node3 12081 1726882438.42855: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882438.42863: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882438.42890: getting variables 12081 1726882438.42892: in VariableManager get_vars() 12081 1726882438.42940: Calling all_inventory to load vars for managed_node3 12081 1726882438.42943: Calling groups_inventory to load vars for managed_node3 12081 1726882438.42946: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882438.42957: Calling all_plugins_play to load vars for managed_node3 12081 1726882438.42961: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882438.42966: Calling groups_plugins_play to load vars for managed_node3 12081 1726882438.45701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882438.48034: done with get_vars() 12081 1726882438.48069: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:58 -0400 (0:00:00.080) 0:00:58.288 ****** 12081 1726882438.48529: entering _queue_task() for managed_node3/service_facts 12081 1726882438.48923: worker is 1 (out of 1 available) 12081 1726882438.48935: exiting _queue_task() for managed_node3/service_facts 12081 1726882438.48950: done queuing things up, now waiting for results queue to drain 12081 1726882438.48951: waiting for pending results... 
12081 1726882438.49263: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12081 1726882438.49417: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000fe3 12081 1726882438.49428: variable 'ansible_search_path' from source: unknown 12081 1726882438.49432: variable 'ansible_search_path' from source: unknown 12081 1726882438.49468: calling self._execute() 12081 1726882438.49558: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.49561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.49572: variable 'omit' from source: magic vars 12081 1726882438.49925: variable 'ansible_distribution_major_version' from source: facts 12081 1726882438.49943: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882438.49949: variable 'omit' from source: magic vars 12081 1726882438.50041: variable 'omit' from source: magic vars 12081 1726882438.50078: variable 'omit' from source: magic vars 12081 1726882438.50120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882438.50158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882438.50181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882438.50198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882438.50209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882438.50240: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882438.50243: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.50246: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882438.50357: Set connection var ansible_pipelining to False 12081 1726882438.50360: Set connection var ansible_shell_type to sh 12081 1726882438.50365: Set connection var ansible_shell_executable to /bin/sh 12081 1726882438.50373: Set connection var ansible_connection to ssh 12081 1726882438.50380: Set connection var ansible_timeout to 10 12081 1726882438.50385: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882438.50411: variable 'ansible_shell_executable' from source: unknown 12081 1726882438.50414: variable 'ansible_connection' from source: unknown 12081 1726882438.50418: variable 'ansible_module_compression' from source: unknown 12081 1726882438.50421: variable 'ansible_shell_type' from source: unknown 12081 1726882438.50423: variable 'ansible_shell_executable' from source: unknown 12081 1726882438.50426: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882438.50428: variable 'ansible_pipelining' from source: unknown 12081 1726882438.50431: variable 'ansible_timeout' from source: unknown 12081 1726882438.50433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882438.51507: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882438.51635: variable 'omit' from source: magic vars 12081 1726882438.51638: starting attempt loop 12081 1726882438.51641: running the handler 12081 1726882438.51658: _low_level_execute_command(): starting 12081 1726882438.51668: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882438.54009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12081 1726882438.54019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.54194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.54199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.54213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.54220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.54337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882438.54396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882438.54400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882438.54528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882438.56249: stdout chunk (state=3): >>>/root <<< 12081 1726882438.56383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882438.56495: stderr chunk (state=3): >>><<< 12081 1726882438.56502: stdout chunk (state=3): >>><<< 12081 1726882438.56522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882438.56539: _low_level_execute_command(): starting 12081 1726882438.56546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005 `" && echo ansible-tmp-1726882438.5652368-14712-184276479820005="` echo /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005 `" ) && sleep 0' 12081 1726882438.58170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882438.58174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.58178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.58181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.58191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.58193: stderr 
chunk (state=3): >>>debug2: match not found <<< 12081 1726882438.58196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.58220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882438.58223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882438.58225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882438.58228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.58276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.58280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.58282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.58285: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882438.58287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.58357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882438.58380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882438.58393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882438.58524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882438.60650: stdout chunk (state=3): >>>ansible-tmp-1726882438.5652368-14712-184276479820005=/root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005 <<< 12081 1726882438.60761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882438.60812: stderr chunk (state=3): >>><<< 12081 1726882438.60816: stdout chunk (state=3): >>><<< 12081 1726882438.60830: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882438.5652368-14712-184276479820005=/root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882438.60872: variable 'ansible_module_compression' from source: unknown 12081 1726882438.60907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12081 1726882438.60937: variable 'ansible_facts' from source: unknown 12081 1726882438.61055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/AnsiballZ_service_facts.py 12081 1726882438.61208: Sending initial data 12081 1726882438.61212: Sent initial data (162 bytes) 12081 1726882438.62176: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882438.62286: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.62290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.62304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.62340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.62344: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882438.62367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.62370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882438.62378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882438.62381: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882438.62400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.62403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.62406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.62414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.62421: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882438.62431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.62510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882438.62618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882438.62636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882438.62764: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882438.64765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882438.64857: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882438.64955: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpzbtjn0w5 /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/AnsiballZ_service_facts.py <<< 12081 1726882438.65052: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882438.66445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882438.66524: stderr chunk (state=3): >>><<< 12081 1726882438.66528: stdout chunk (state=3): >>><<< 12081 1726882438.66545: done transferring module to remote 12081 1726882438.66559: _low_level_execute_command(): starting 12081 1726882438.66562: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/ /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/AnsiballZ_service_facts.py && sleep 0' 12081 1726882438.67159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882438.67169: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 12081 1726882438.67177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.67190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.67225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.67234: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882438.67243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.67257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882438.67260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882438.67273: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882438.67281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.67303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.67305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.67308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.67314: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882438.67333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.67406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882438.67409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882438.67422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882438.67528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 12081 1726882438.69262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882438.69343: stderr chunk (state=3): >>><<< 12081 1726882438.69347: stdout chunk (state=3): >>><<< 12081 1726882438.69367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882438.69370: _low_level_execute_command(): starting 12081 1726882438.69375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/AnsiballZ_service_facts.py && sleep 0' 12081 1726882438.70020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882438.70034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.70053: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12081 1726882438.70077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.70124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.70137: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882438.70151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.70185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882438.70200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882438.70212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882438.70223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882438.70235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882438.70249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882438.70260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882438.70282: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882438.70299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882438.70385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882438.70413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882438.70427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882438.70568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.03353: stdout chunk (state=3): >>> {"ansible_facts": {"services": 
{"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": 
"sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source<<< 12081 1726882440.03378: stdout chunk (state=3): >>>": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servic<<< 12081 1726882440.03412: stdout chunk (state=3): >>>e": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", 
"source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": 
{"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "stat<<< 12081 1726882440.03418: stdout chunk (state=3): >>>ic", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": 
"systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12081 1726882440.04738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882440.05369: stdout chunk (state=3): >>><<< 12081 1726882440.05373: stderr chunk (state=3): >>><<< 12081 1726882440.05380: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": 
"getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": 
{"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882440.05830: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882440.05848: _low_level_execute_command(): starting 12081 1726882440.05861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882438.5652368-14712-184276479820005/ > /dev/null 2>&1 && sleep 0' 12081 
1726882440.06539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882440.06555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.06573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.06593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.06637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.06650: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882440.06671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.06689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882440.06701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882440.06714: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882440.06726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.06738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.06752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.06770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.06781: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882440.06794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.06869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.06887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.06902: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.07042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.08908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.08912: stdout chunk (state=3): >>><<< 12081 1726882440.08920: stderr chunk (state=3): >>><<< 12081 1726882440.08936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882440.08943: handler run complete 12081 1726882440.09130: variable 'ansible_facts' from source: unknown 12081 1726882440.09363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882440.09795: variable 'ansible_facts' from source: unknown 12081 1726882440.09921: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882440.10108: attempt loop complete, returning result 12081 1726882440.10111: _execute() done 12081 1726882440.10114: dumping result to json 12081 1726882440.10172: done dumping result, returning 12081 1726882440.10182: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-0a3f-ff3c-000000000fe3] 12081 1726882440.10189: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe3 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882440.11924: no more pending results, returning what we have 12081 1726882440.11927: results queue empty 12081 1726882440.11928: checking for any_errors_fatal 12081 1726882440.11931: done checking for any_errors_fatal 12081 1726882440.11932: checking for max_fail_percentage 12081 1726882440.11933: done checking for max_fail_percentage 12081 1726882440.11934: checking to see if all hosts have failed and the running result is not ok 12081 1726882440.11935: done checking to see if all hosts have failed 12081 1726882440.11935: getting the remaining hosts for this loop 12081 1726882440.11937: done getting the remaining hosts for this loop 12081 1726882440.11940: getting the next task for host managed_node3 12081 1726882440.11946: done getting next task for host managed_node3 12081 1726882440.11948: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882440.11956: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882440.11972: getting variables 12081 1726882440.11973: in VariableManager get_vars() 12081 1726882440.12013: Calling all_inventory to load vars for managed_node3 12081 1726882440.12016: Calling groups_inventory to load vars for managed_node3 12081 1726882440.12019: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882440.12029: Calling all_plugins_play to load vars for managed_node3 12081 1726882440.12032: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882440.12035: Calling groups_plugins_play to load vars for managed_node3 12081 1726882440.12895: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe3 12081 1726882440.12899: WORKER PROCESS EXITING 12081 1726882440.13985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882440.15821: done with get_vars() 12081 1726882440.15849: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:00 -0400 (0:00:01.674) 0:00:59.962 ****** 12081 1726882440.15956: entering _queue_task() for managed_node3/package_facts 12081 1726882440.16305: worker is 1 (out of 1 available) 12081 1726882440.16316: exiting _queue_task() for managed_node3/package_facts 12081 1726882440.16331: done queuing things up, now waiting for results queue to drain 12081 1726882440.16333: waiting for pending results... 
12081 1726882440.16632: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12081 1726882440.16801: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000fe4 12081 1726882440.16819: variable 'ansible_search_path' from source: unknown 12081 1726882440.16824: variable 'ansible_search_path' from source: unknown 12081 1726882440.16861: calling self._execute() 12081 1726882440.16956: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882440.16968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882440.16987: variable 'omit' from source: magic vars 12081 1726882440.17339: variable 'ansible_distribution_major_version' from source: facts 12081 1726882440.17355: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882440.17367: variable 'omit' from source: magic vars 12081 1726882440.17467: variable 'omit' from source: magic vars 12081 1726882440.17503: variable 'omit' from source: magic vars 12081 1726882440.17556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882440.17598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882440.17622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882440.17653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882440.17671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882440.17706: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882440.17715: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882440.17723: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12081 1726882440.17833: Set connection var ansible_pipelining to False 12081 1726882440.17844: Set connection var ansible_shell_type to sh 12081 1726882440.17865: Set connection var ansible_shell_executable to /bin/sh 12081 1726882440.17874: Set connection var ansible_connection to ssh 12081 1726882440.17884: Set connection var ansible_timeout to 10 12081 1726882440.17894: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882440.17921: variable 'ansible_shell_executable' from source: unknown 12081 1726882440.17929: variable 'ansible_connection' from source: unknown 12081 1726882440.17937: variable 'ansible_module_compression' from source: unknown 12081 1726882440.17944: variable 'ansible_shell_type' from source: unknown 12081 1726882440.17951: variable 'ansible_shell_executable' from source: unknown 12081 1726882440.17962: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882440.17976: variable 'ansible_pipelining' from source: unknown 12081 1726882440.17983: variable 'ansible_timeout' from source: unknown 12081 1726882440.17991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882440.18198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882440.18215: variable 'omit' from source: magic vars 12081 1726882440.18225: starting attempt loop 12081 1726882440.18232: running the handler 12081 1726882440.18250: _low_level_execute_command(): starting 12081 1726882440.18265: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882440.19045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882440.19068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12081 1726882440.19085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.19105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.19148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.19166: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882440.19185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.19205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882440.19218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882440.19231: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882440.19244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.19259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.19282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.19301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.19313: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882440.19329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.19412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.19428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.19443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.19592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 
1726882440.21170: stdout chunk (state=3): >>>/root <<< 12081 1726882440.21276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.21351: stderr chunk (state=3): >>><<< 12081 1726882440.21361: stdout chunk (state=3): >>><<< 12081 1726882440.21393: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882440.21405: _low_level_execute_command(): starting 12081 1726882440.21411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175 `" && echo ansible-tmp-1726882440.2139163-14777-10203361960175="` echo /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175 `" ) && sleep 0' 12081 1726882440.22128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12081 1726882440.22139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.22156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.22175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.22212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.22219: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882440.22229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.22243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882440.22253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882440.22271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882440.22279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.22289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.22302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.22309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.22316: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882440.22325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.22423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.22440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.22452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12081 1726882440.22607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.24455: stdout chunk (state=3): >>>ansible-tmp-1726882440.2139163-14777-10203361960175=/root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175 <<< 12081 1726882440.24569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.24613: stderr chunk (state=3): >>><<< 12081 1726882440.24616: stdout chunk (state=3): >>><<< 12081 1726882440.24630: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882440.2139163-14777-10203361960175=/root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882440.24672: variable 'ansible_module_compression' from source: unknown 12081 1726882440.24711: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12081 1726882440.24753: variable 'ansible_facts' from source: unknown 12081 1726882440.24886: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/AnsiballZ_package_facts.py 12081 1726882440.25002: Sending initial data 12081 1726882440.25005: Sent initial data (161 bytes) 12081 1726882440.25657: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882440.25673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.25678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.25689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.25717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.25724: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882440.25731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.25743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882440.25750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882440.25755: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.25773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.25779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882440.25784: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.25830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.25862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.25867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.25976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.27688: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882440.27783: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882440.27884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp62laceev /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/AnsiballZ_package_facts.py <<< 12081 1726882440.27982: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882440.30338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.30422: stderr chunk (state=3): >>><<< 12081 1726882440.30425: stdout chunk (state=3): >>><<< 12081 1726882440.30444: done transferring module to remote 12081 1726882440.30458: _low_level_execute_command(): starting 12081 1726882440.30461: _low_level_execute_command(): executing: /bin/sh 
-c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/ /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/AnsiballZ_package_facts.py && sleep 0' 12081 1726882440.30966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.30970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.31110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.31125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.31247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.33375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.33440: stderr chunk (state=3): >>><<< 12081 1726882440.33443: stdout chunk (state=3): >>><<< 12081 1726882440.33461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882440.33474: _low_level_execute_command(): starting 12081 1726882440.33482: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/AnsiballZ_package_facts.py && sleep 0' 12081 1726882440.34545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882440.34685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.34699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.34721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.34774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.34847: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882440.34851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 12081 1726882440.34855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.34857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.34861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.34920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.34923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.34932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.35048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882440.81837: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", 
"version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": 
[{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": 
"8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": 
"2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": 
"0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": 
"1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3",
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name":
"device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1",
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", 
"version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": 
"iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": 
"perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 12081 1726882440.81996: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": 
[{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 12081 1726882440.82020: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": 
"mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 12081 1726882440.82030: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": 
[{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12081 1726882440.83569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882440.83647: stderr chunk (state=3): >>><<< 12081 1726882440.83649: stdout chunk (state=3): >>><<< 12081 1726882440.83782: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882440.86504: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882440.86534: _low_level_execute_command(): starting 12081 1726882440.86543: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882440.2139163-14777-10203361960175/ > /dev/null 2>&1 && sleep 0' 12081 1726882440.87233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882440.87246: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12081 1726882440.87259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.87283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.87325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.87336: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882440.87348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.87368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882440.87384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882440.87395: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882440.87405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882440.87417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882440.87430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882440.87441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882440.87450: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882440.87464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882440.87546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882440.87570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882440.87586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882440.87728: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12081 1726882440.89648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882440.89651: stdout chunk (state=3): >>><<< 12081 1726882440.89654: stderr chunk (state=3): >>><<< 12081 1726882440.90269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882440.90273: handler run complete 12081 1726882440.90642: variable 'ansible_facts' from source: unknown 12081 1726882440.91176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882440.93220: variable 'ansible_facts' from source: unknown 12081 1726882440.93689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882440.94482: attempt loop complete, returning result 12081 
1726882440.94500: _execute() done 12081 1726882440.94507: dumping result to json 12081 1726882440.94742: done dumping result, returning 12081 1726882440.94756: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-0a3f-ff3c-000000000fe4] 12081 1726882440.94769: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe4 12081 1726882441.02130: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000fe4 12081 1726882441.02133: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882441.02298: no more pending results, returning what we have 12081 1726882441.02301: results queue empty 12081 1726882441.02302: checking for any_errors_fatal 12081 1726882441.02308: done checking for any_errors_fatal 12081 1726882441.02308: checking for max_fail_percentage 12081 1726882441.02310: done checking for max_fail_percentage 12081 1726882441.02311: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.02312: done checking to see if all hosts have failed 12081 1726882441.02313: getting the remaining hosts for this loop 12081 1726882441.02314: done getting the remaining hosts for this loop 12081 1726882441.02318: getting the next task for host managed_node3 12081 1726882441.02325: done getting next task for host managed_node3 12081 1726882441.02329: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882441.02334: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882441.02347: getting variables 12081 1726882441.02349: in VariableManager get_vars() 12081 1726882441.02390: Calling all_inventory to load vars for managed_node3 12081 1726882441.02393: Calling groups_inventory to load vars for managed_node3 12081 1726882441.02399: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.02409: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.02411: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.02414: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.03524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.04480: done with get_vars() 12081 1726882441.04499: done getting variables 12081 1726882441.04543: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:01 -0400 (0:00:00.886) 0:01:00.848 ****** 12081 1726882441.04573: entering _queue_task() for managed_node3/debug 12081 1726882441.04833: worker is 1 (out of 1 available) 12081 1726882441.04846: exiting _queue_task() for managed_node3/debug 12081 1726882441.04865: done queuing things up, now waiting for results queue to drain 12081 1726882441.04866: waiting for pending results... 12081 1726882441.05112: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12081 1726882441.05293: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0b 12081 1726882441.05318: variable 'ansible_search_path' from source: unknown 12081 1726882441.05327: variable 'ansible_search_path' from source: unknown 12081 1726882441.05373: calling self._execute() 12081 1726882441.05469: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.05484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.05498: variable 'omit' from source: magic vars 12081 1726882441.05886: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.05908: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.05922: variable 'omit' from source: magic vars 12081 1726882441.06003: variable 'omit' from source: magic vars 12081 1726882441.06118: variable 'network_provider' from source: set_fact 12081 1726882441.06152: variable 'omit' from source: magic vars 12081 1726882441.06189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882441.06214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882441.06253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 
1726882441.06260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882441.06278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882441.06306: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882441.06309: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.06312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.06386: Set connection var ansible_pipelining to False 12081 1726882441.06391: Set connection var ansible_shell_type to sh 12081 1726882441.06396: Set connection var ansible_shell_executable to /bin/sh 12081 1726882441.06399: Set connection var ansible_connection to ssh 12081 1726882441.06404: Set connection var ansible_timeout to 10 12081 1726882441.06409: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882441.06435: variable 'ansible_shell_executable' from source: unknown 12081 1726882441.06440: variable 'ansible_connection' from source: unknown 12081 1726882441.06442: variable 'ansible_module_compression' from source: unknown 12081 1726882441.06445: variable 'ansible_shell_type' from source: unknown 12081 1726882441.06449: variable 'ansible_shell_executable' from source: unknown 12081 1726882441.06452: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.06454: variable 'ansible_pipelining' from source: unknown 12081 1726882441.06455: variable 'ansible_timeout' from source: unknown 12081 1726882441.06460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.06560: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882441.06571: variable 'omit' from source: magic vars 12081 1726882441.06580: starting attempt loop 12081 1726882441.06583: running the handler 12081 1726882441.06617: handler run complete 12081 1726882441.06628: attempt loop complete, returning result 12081 1726882441.06631: _execute() done 12081 1726882441.06633: dumping result to json 12081 1726882441.06636: done dumping result, returning 12081 1726882441.06643: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-0a3f-ff3c-000000000e0b] 12081 1726882441.06649: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0b 12081 1726882441.06734: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0b 12081 1726882441.06737: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 12081 1726882441.06806: no more pending results, returning what we have 12081 1726882441.06810: results queue empty 12081 1726882441.06811: checking for any_errors_fatal 12081 1726882441.06822: done checking for any_errors_fatal 12081 1726882441.06823: checking for max_fail_percentage 12081 1726882441.06824: done checking for max_fail_percentage 12081 1726882441.06825: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.06826: done checking to see if all hosts have failed 12081 1726882441.06827: getting the remaining hosts for this loop 12081 1726882441.06828: done getting the remaining hosts for this loop 12081 1726882441.06832: getting the next task for host managed_node3 12081 1726882441.06839: done getting next task for host managed_node3 12081 1726882441.06842: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 12081 1726882441.06847: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.06859: getting variables 12081 1726882441.06861: in VariableManager get_vars() 12081 1726882441.06901: Calling all_inventory to load vars for managed_node3 12081 1726882441.06904: Calling groups_inventory to load vars for managed_node3 12081 1726882441.06906: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.06914: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.06916: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.06919: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.12422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.13956: done with get_vars() 12081 1726882441.13981: done getting variables 12081 1726882441.14027: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:01 -0400 (0:00:00.094) 0:01:00.943 ****** 12081 1726882441.14075: entering _queue_task() for managed_node3/fail 12081 1726882441.14435: worker is 1 (out of 1 available) 12081 1726882441.14446: exiting _queue_task() for managed_node3/fail 12081 1726882441.14461: done queuing things up, now waiting for results queue to drain 12081 1726882441.14465: waiting for pending results... 
12081 1726882441.15518: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12081 1726882441.15682: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0c 12081 1726882441.15694: variable 'ansible_search_path' from source: unknown 12081 1726882441.15698: variable 'ansible_search_path' from source: unknown 12081 1726882441.15744: calling self._execute() 12081 1726882441.15840: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.15845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.15858: variable 'omit' from source: magic vars 12081 1726882441.16395: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.16408: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.16571: variable 'network_state' from source: role '' defaults 12081 1726882441.16587: Evaluated conditional (network_state != {}): False 12081 1726882441.16594: when evaluation is False, skipping this task 12081 1726882441.16601: _execute() done 12081 1726882441.16609: dumping result to json 12081 1726882441.16625: done dumping result, returning 12081 1726882441.16638: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-0a3f-ff3c-000000000e0c] 12081 1726882441.16651: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882441.16818: no more pending results, returning what we have 12081 1726882441.16823: results queue empty 12081 1726882441.16824: checking for any_errors_fatal 12081 1726882441.16835: done 
checking for any_errors_fatal 12081 1726882441.16835: checking for max_fail_percentage 12081 1726882441.16837: done checking for max_fail_percentage 12081 1726882441.16838: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.16840: done checking to see if all hosts have failed 12081 1726882441.16840: getting the remaining hosts for this loop 12081 1726882441.16842: done getting the remaining hosts for this loop 12081 1726882441.16846: getting the next task for host managed_node3 12081 1726882441.16858: done getting next task for host managed_node3 12081 1726882441.16867: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882441.16873: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.16904: getting variables 12081 1726882441.16906: in VariableManager get_vars() 12081 1726882441.16961: Calling all_inventory to load vars for managed_node3 12081 1726882441.16966: Calling groups_inventory to load vars for managed_node3 12081 1726882441.16969: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.16982: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.16985: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.16988: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.18218: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0c 12081 1726882441.18222: WORKER PROCESS EXITING 12081 1726882441.18790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.19847: done with get_vars() 12081 1726882441.19867: done getting variables 12081 1726882441.19909: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:01 -0400 (0:00:00.058) 0:01:01.002 ****** 12081 1726882441.19934: entering _queue_task() for managed_node3/fail 12081 1726882441.20153: worker is 1 (out of 1 available) 12081 1726882441.20170: exiting _queue_task() for managed_node3/fail 12081 1726882441.20184: done queuing things up, now waiting for results queue to drain 12081 1726882441.20185: waiting for pending results... 
12081 1726882441.20379: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12081 1726882441.20475: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0d 12081 1726882441.20489: variable 'ansible_search_path' from source: unknown 12081 1726882441.20494: variable 'ansible_search_path' from source: unknown 12081 1726882441.20526: calling self._execute() 12081 1726882441.20596: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.20600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.20610: variable 'omit' from source: magic vars 12081 1726882441.20885: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.20896: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.20980: variable 'network_state' from source: role '' defaults 12081 1726882441.20987: Evaluated conditional (network_state != {}): False 12081 1726882441.20991: when evaluation is False, skipping this task 12081 1726882441.20993: _execute() done 12081 1726882441.20996: dumping result to json 12081 1726882441.20998: done dumping result, returning 12081 1726882441.21005: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-0a3f-ff3c-000000000e0d] 12081 1726882441.21012: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0d 12081 1726882441.21110: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0d 12081 1726882441.21113: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882441.21159: no more pending results, returning what we have 12081 
1726882441.21162: results queue empty 12081 1726882441.21165: checking for any_errors_fatal 12081 1726882441.21172: done checking for any_errors_fatal 12081 1726882441.21173: checking for max_fail_percentage 12081 1726882441.21174: done checking for max_fail_percentage 12081 1726882441.21175: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.21176: done checking to see if all hosts have failed 12081 1726882441.21177: getting the remaining hosts for this loop 12081 1726882441.21178: done getting the remaining hosts for this loop 12081 1726882441.21188: getting the next task for host managed_node3 12081 1726882441.21195: done getting next task for host managed_node3 12081 1726882441.21199: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882441.21204: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12081 1726882441.21223: getting variables 12081 1726882441.21224: in VariableManager get_vars() 12081 1726882441.21265: Calling all_inventory to load vars for managed_node3 12081 1726882441.21268: Calling groups_inventory to load vars for managed_node3 12081 1726882441.21270: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.21276: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.21278: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.21280: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.22072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.23028: done with get_vars() 12081 1726882441.23042: done getting variables 12081 1726882441.23087: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:01 -0400 (0:00:00.031) 0:01:01.034 ****** 12081 1726882441.23112: entering _queue_task() for managed_node3/fail 12081 1726882441.23316: worker is 1 (out of 1 available) 12081 1726882441.23329: exiting _queue_task() for managed_node3/fail 12081 1726882441.23342: done queuing things up, now waiting for results queue to drain 12081 1726882441.23343: waiting for pending results... 
12081 1726882441.23535: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12081 1726882441.23644: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0e 12081 1726882441.23655: variable 'ansible_search_path' from source: unknown 12081 1726882441.23661: variable 'ansible_search_path' from source: unknown 12081 1726882441.23696: calling self._execute() 12081 1726882441.23771: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.23775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.23784: variable 'omit' from source: magic vars 12081 1726882441.24053: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.24067: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.24189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.25818: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.26126: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.26153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.26183: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.26203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.26261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.26284: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.26304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.26330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.26341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.26409: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.26422: Evaluated conditional (ansible_distribution_major_version | int > 9): False 12081 1726882441.26426: when evaluation is False, skipping this task 12081 1726882441.26429: _execute() done 12081 1726882441.26431: dumping result to json 12081 1726882441.26434: done dumping result, returning 12081 1726882441.26440: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-0a3f-ff3c-000000000e0e] 12081 1726882441.26446: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0e 12081 1726882441.26537: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0e 12081 1726882441.26539: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 12081 1726882441.26589: no more pending results, returning what we have 12081 1726882441.26593: 
results queue empty 12081 1726882441.26594: checking for any_errors_fatal 12081 1726882441.26601: done checking for any_errors_fatal 12081 1726882441.26601: checking for max_fail_percentage 12081 1726882441.26603: done checking for max_fail_percentage 12081 1726882441.26604: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.26605: done checking to see if all hosts have failed 12081 1726882441.26606: getting the remaining hosts for this loop 12081 1726882441.26608: done getting the remaining hosts for this loop 12081 1726882441.26611: getting the next task for host managed_node3 12081 1726882441.26619: done getting next task for host managed_node3 12081 1726882441.26623: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882441.26629: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12081 1726882441.26657: getting variables 12081 1726882441.26659: in VariableManager get_vars() 12081 1726882441.26701: Calling all_inventory to load vars for managed_node3 12081 1726882441.26704: Calling groups_inventory to load vars for managed_node3 12081 1726882441.26706: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.26716: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.26719: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.26721: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.27695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.28640: done with get_vars() 12081 1726882441.28661: done getting variables 12081 1726882441.28709: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:01 -0400 (0:00:00.056) 0:01:01.090 ****** 12081 1726882441.28736: entering _queue_task() for managed_node3/dnf 12081 1726882441.28985: worker is 1 (out of 1 available) 12081 1726882441.28998: exiting _queue_task() for managed_node3/dnf 12081 1726882441.29012: done queuing things up, now waiting for results queue to drain 12081 1726882441.29014: waiting for pending results... 
12081 1726882441.29215: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12081 1726882441.29313: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e0f 12081 1726882441.29323: variable 'ansible_search_path' from source: unknown 12081 1726882441.29327: variable 'ansible_search_path' from source: unknown 12081 1726882441.29367: calling self._execute() 12081 1726882441.29437: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.29440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.29448: variable 'omit' from source: magic vars 12081 1726882441.29737: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.29747: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.29891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.31546: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.31603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.31634: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.31660: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.31682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.31743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.31765: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.31784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.31810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.31820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.31907: variable 'ansible_distribution' from source: facts 12081 1726882441.31910: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.31923: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12081 1726882441.32004: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.32092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.32109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.32125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.32150: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.32164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.32195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.32210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.32227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.32251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.32267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.32296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.32312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 
1726882441.32328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.32352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.32366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.32470: variable 'network_connections' from source: task vars 12081 1726882441.32480: variable 'port2_profile' from source: play vars 12081 1726882441.32526: variable 'port2_profile' from source: play vars 12081 1726882441.32534: variable 'port1_profile' from source: play vars 12081 1726882441.32579: variable 'port1_profile' from source: play vars 12081 1726882441.32586: variable 'controller_profile' from source: play vars 12081 1726882441.32631: variable 'controller_profile' from source: play vars 12081 1726882441.32684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882441.32803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882441.32831: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882441.32855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882441.32880: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882441.32910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 12081 1726882441.32935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882441.32950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.32971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882441.33009: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882441.33165: variable 'network_connections' from source: task vars 12081 1726882441.33169: variable 'port2_profile' from source: play vars 12081 1726882441.33210: variable 'port2_profile' from source: play vars 12081 1726882441.33216: variable 'port1_profile' from source: play vars 12081 1726882441.33264: variable 'port1_profile' from source: play vars 12081 1726882441.33270: variable 'controller_profile' from source: play vars 12081 1726882441.33311: variable 'controller_profile' from source: play vars 12081 1726882441.33329: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882441.33332: when evaluation is False, skipping this task 12081 1726882441.33334: _execute() done 12081 1726882441.33336: dumping result to json 12081 1726882441.33339: done dumping result, returning 12081 1726882441.33346: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000e0f] 12081 1726882441.33352: sending task result for task 
0e448fcc-3ce9-0a3f-ff3c-000000000e0f 12081 1726882441.33444: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e0f 12081 1726882441.33447: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882441.33515: no more pending results, returning what we have 12081 1726882441.33519: results queue empty 12081 1726882441.33520: checking for any_errors_fatal 12081 1726882441.33527: done checking for any_errors_fatal 12081 1726882441.33528: checking for max_fail_percentage 12081 1726882441.33530: done checking for max_fail_percentage 12081 1726882441.33531: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.33532: done checking to see if all hosts have failed 12081 1726882441.33533: getting the remaining hosts for this loop 12081 1726882441.33535: done getting the remaining hosts for this loop 12081 1726882441.33538: getting the next task for host managed_node3 12081 1726882441.33547: done getting next task for host managed_node3 12081 1726882441.33552: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882441.33557: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882441.33586: getting variables 12081 1726882441.33588: in VariableManager get_vars() 12081 1726882441.33630: Calling all_inventory to load vars for managed_node3 12081 1726882441.33632: Calling groups_inventory to load vars for managed_node3 12081 1726882441.33634: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.33645: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.33647: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.33650: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.34483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.35430: done with get_vars() 12081 1726882441.35446: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12081 1726882441.35502: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:01 -0400 (0:00:00.067) 0:01:01.158 ****** 12081 1726882441.35527: entering _queue_task() for managed_node3/yum 12081 1726882441.35761: worker is 1 (out of 1 available) 12081 1726882441.35774: exiting _queue_task() for managed_node3/yum 12081 1726882441.35787: done queuing things up, now waiting for results queue to drain 12081 1726882441.35788: waiting for pending results... 12081 1726882441.35990: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12081 1726882441.36086: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e10 12081 1726882441.36097: variable 'ansible_search_path' from source: unknown 12081 1726882441.36101: variable 'ansible_search_path' from source: unknown 12081 1726882441.36131: calling self._execute() 12081 1726882441.36206: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.36210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.36219: variable 'omit' from source: magic vars 12081 1726882441.36500: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.36516: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.36636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.38523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.38569: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.38609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 
1726882441.38634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.38654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.38722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.38742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.38762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.38793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.38805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.38876: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.38889: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12081 1726882441.38892: when evaluation is False, skipping this task 12081 1726882441.38894: _execute() done 12081 1726882441.38897: dumping result to json 12081 1726882441.38899: done dumping result, returning 12081 1726882441.38908: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or 
team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000e10] 12081 1726882441.38912: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e10 12081 1726882441.39006: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e10 12081 1726882441.39008: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12081 1726882441.39066: no more pending results, returning what we have 12081 1726882441.39069: results queue empty 12081 1726882441.39070: checking for any_errors_fatal 12081 1726882441.39077: done checking for any_errors_fatal 12081 1726882441.39077: checking for max_fail_percentage 12081 1726882441.39079: done checking for max_fail_percentage 12081 1726882441.39080: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.39081: done checking to see if all hosts have failed 12081 1726882441.39082: getting the remaining hosts for this loop 12081 1726882441.39083: done getting the remaining hosts for this loop 12081 1726882441.39087: getting the next task for host managed_node3 12081 1726882441.39095: done getting next task for host managed_node3 12081 1726882441.39099: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882441.39106: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882441.39135: getting variables 12081 1726882441.39137: in VariableManager get_vars() 12081 1726882441.39181: Calling all_inventory to load vars for managed_node3 12081 1726882441.39183: Calling groups_inventory to load vars for managed_node3 12081 1726882441.39185: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.39195: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.39197: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.39200: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.40152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.41087: done with get_vars() 12081 1726882441.41104: done getting variables 12081 1726882441.41146: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:01 -0400 (0:00:00.056) 0:01:01.214 ****** 12081 1726882441.41173: entering _queue_task() for managed_node3/fail 12081 1726882441.41404: worker is 1 (out of 1 available) 12081 1726882441.41419: exiting _queue_task() for managed_node3/fail 12081 1726882441.41431: done queuing things up, now waiting for results queue to drain 12081 1726882441.41433: waiting for pending results... 12081 1726882441.41628: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12081 1726882441.41729: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e11 12081 1726882441.41741: variable 'ansible_search_path' from source: unknown 12081 1726882441.41745: variable 'ansible_search_path' from source: unknown 12081 1726882441.41780: calling self._execute() 12081 1726882441.41853: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.41858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.41873: variable 'omit' from source: magic vars 12081 1726882441.42149: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.42162: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.42249: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.42391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.44022: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.44080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.44106: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.44131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.44155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.44213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.44233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.44252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.44285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.44297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.44328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.44343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.44364: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.44394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.44404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.44431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.44446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.44469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.44495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.44505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.44618: variable 'network_connections' from source: task vars 12081 1726882441.44627: variable 'port2_profile' from source: play vars 12081 1726882441.44680: variable 'port2_profile' from source: play vars 12081 
1726882441.44691: variable 'port1_profile' from source: play vars 12081 1726882441.44735: variable 'port1_profile' from source: play vars 12081 1726882441.44743: variable 'controller_profile' from source: play vars 12081 1726882441.44788: variable 'controller_profile' from source: play vars 12081 1726882441.44840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882441.44967: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882441.44994: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882441.45017: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882441.45040: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882441.45076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882441.45091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882441.45108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.45128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882441.45172: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882441.45326: variable 'network_connections' from source: task vars 12081 1726882441.45330: variable 
'port2_profile' from source: play vars 12081 1726882441.45380: variable 'port2_profile' from source: play vars 12081 1726882441.45387: variable 'port1_profile' from source: play vars 12081 1726882441.45428: variable 'port1_profile' from source: play vars 12081 1726882441.45434: variable 'controller_profile' from source: play vars 12081 1726882441.45482: variable 'controller_profile' from source: play vars 12081 1726882441.45501: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882441.45512: when evaluation is False, skipping this task 12081 1726882441.45514: _execute() done 12081 1726882441.45517: dumping result to json 12081 1726882441.45519: done dumping result, returning 12081 1726882441.45522: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000e11] 12081 1726882441.45524: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e11 12081 1726882441.45624: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e11 12081 1726882441.45628: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882441.45684: no more pending results, returning what we have 12081 1726882441.45688: results queue empty 12081 1726882441.45689: checking for any_errors_fatal 12081 1726882441.45695: done checking for any_errors_fatal 12081 1726882441.45695: checking for max_fail_percentage 12081 1726882441.45697: done checking for max_fail_percentage 12081 1726882441.45698: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.45699: done checking to see if all hosts have failed 12081 1726882441.45700: getting the remaining hosts for this loop 12081 
1726882441.45702: done getting the remaining hosts for this loop 12081 1726882441.45706: getting the next task for host managed_node3 12081 1726882441.45714: done getting next task for host managed_node3 12081 1726882441.45718: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12081 1726882441.45724: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.45746: getting variables 12081 1726882441.45747: in VariableManager get_vars() 12081 1726882441.45797: Calling all_inventory to load vars for managed_node3 12081 1726882441.45800: Calling groups_inventory to load vars for managed_node3 12081 1726882441.45802: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.45811: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.45814: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.45816: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.46650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.47712: done with get_vars() 12081 1726882441.47730: done getting variables 12081 1726882441.47777: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:01 -0400 (0:00:00.066) 0:01:01.281 ****** 12081 1726882441.47804: entering _queue_task() for managed_node3/package 12081 1726882441.48042: worker is 1 (out of 1 available) 12081 1726882441.48054: exiting _queue_task() for managed_node3/package 12081 1726882441.48069: done queuing things up, now waiting for results queue to drain 12081 1726882441.48071: waiting for pending results... 
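The "Ask user's consent" skip recorded just above is produced by a `when:` guard on the role task at `roles/network/tasks/main.yml:60`. The task body itself never appears in this log; the fragment below is an illustrative reconstruction only (module choice and message are invented), showing the shape of a task whose guard yields exactly this `skipping:` result when both flags are false:

```yaml
# Hypothetical reconstruction -- NOT the role's real source.
# With both flags false, Ansible records:
#   skip_reason: "Conditional result was False"
- name: Ask user's consent to restart NetworkManager
  ansible.builtin.fail:
    msg: Restarting NetworkManager requires user consent
  when: __network_wireless_connections_defined or __network_team_connections_defined
```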
12081 1726882441.48283: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12081 1726882441.48382: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e12 12081 1726882441.48394: variable 'ansible_search_path' from source: unknown 12081 1726882441.48399: variable 'ansible_search_path' from source: unknown 12081 1726882441.48429: calling self._execute() 12081 1726882441.48509: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.48512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.48523: variable 'omit' from source: magic vars 12081 1726882441.48800: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.48811: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.48953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882441.49152: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882441.49195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882441.49221: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882441.49289: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882441.49376: variable 'network_packages' from source: role '' defaults 12081 1726882441.49450: variable '__network_provider_setup' from source: role '' defaults 12081 1726882441.49459: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882441.49513: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882441.49521: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882441.49566: variable 
'__network_packages_default_nm' from source: role '' defaults 12081 1726882441.49685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.51136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.51180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.51208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.51236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.51259: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.51329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.51350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.51371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.51398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.51408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 
1726882441.51440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.51458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.51482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.51507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.51517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.51675: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882441.51748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.51767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.51787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.51812: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.51822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.51890: variable 'ansible_python' from source: facts 12081 1726882441.51904: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882441.51964: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882441.52021: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882441.52107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.52124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.52141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.52168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.52179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.52214: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.52233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.52249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.52278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.52289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.52391: variable 'network_connections' from source: task vars 12081 1726882441.52395: variable 'port2_profile' from source: play vars 12081 1726882441.52468: variable 'port2_profile' from source: play vars 12081 1726882441.52478: variable 'port1_profile' from source: play vars 12081 1726882441.52547: variable 'port1_profile' from source: play vars 12081 1726882441.52558: variable 'controller_profile' from source: play vars 12081 1726882441.52628: variable 'controller_profile' from source: play vars 12081 1726882441.52681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882441.52700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 12081 1726882441.52720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.52742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882441.52785: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.52960: variable 'network_connections' from source: task vars 12081 1726882441.52971: variable 'port2_profile' from source: play vars 12081 1726882441.53038: variable 'port2_profile' from source: play vars 12081 1726882441.53046: variable 'port1_profile' from source: play vars 12081 1726882441.53122: variable 'port1_profile' from source: play vars 12081 1726882441.53130: variable 'controller_profile' from source: play vars 12081 1726882441.53202: variable 'controller_profile' from source: play vars 12081 1726882441.53229: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882441.53285: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.53490: variable 'network_connections' from source: task vars 12081 1726882441.53493: variable 'port2_profile' from source: play vars 12081 1726882441.53541: variable 'port2_profile' from source: play vars 12081 1726882441.53547: variable 'port1_profile' from source: play vars 12081 1726882441.53594: variable 'port1_profile' from source: play vars 12081 1726882441.53600: variable 'controller_profile' from source: play vars 12081 1726882441.53646: variable 'controller_profile' from source: play vars 12081 1726882441.53666: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882441.53722: variable '__network_team_connections_defined' from 
source: role '' defaults 12081 1726882441.53916: variable 'network_connections' from source: task vars 12081 1726882441.53920: variable 'port2_profile' from source: play vars 12081 1726882441.53970: variable 'port2_profile' from source: play vars 12081 1726882441.53976: variable 'port1_profile' from source: play vars 12081 1726882441.54020: variable 'port1_profile' from source: play vars 12081 1726882441.54027: variable 'controller_profile' from source: play vars 12081 1726882441.54075: variable 'controller_profile' from source: play vars 12081 1726882441.54114: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882441.54160: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882441.54168: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882441.54207: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882441.54342: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882441.54660: variable 'network_connections' from source: task vars 12081 1726882441.54666: variable 'port2_profile' from source: play vars 12081 1726882441.54708: variable 'port2_profile' from source: play vars 12081 1726882441.54715: variable 'port1_profile' from source: play vars 12081 1726882441.54759: variable 'port1_profile' from source: play vars 12081 1726882441.54763: variable 'controller_profile' from source: play vars 12081 1726882441.54805: variable 'controller_profile' from source: play vars 12081 1726882441.54813: variable 'ansible_distribution' from source: facts 12081 1726882441.54816: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.54824: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.54835: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882441.54943: 
variable 'ansible_distribution' from source: facts 12081 1726882441.54947: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.54952: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.54965: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882441.55074: variable 'ansible_distribution' from source: facts 12081 1726882441.55078: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.55082: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.55109: variable 'network_provider' from source: set_fact 12081 1726882441.55120: variable 'ansible_facts' from source: unknown 12081 1726882441.55595: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12081 1726882441.55599: when evaluation is False, skipping this task 12081 1726882441.55601: _execute() done 12081 1726882441.55604: dumping result to json 12081 1726882441.55606: done dumping result, returning 12081 1726882441.55614: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-0a3f-ff3c-000000000e12] 12081 1726882441.55619: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e12 12081 1726882441.55720: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e12 12081 1726882441.55723: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12081 1726882441.55774: no more pending results, returning what we have 12081 1726882441.55778: results queue empty 12081 1726882441.55779: checking for any_errors_fatal 12081 1726882441.55792: done checking for any_errors_fatal 12081 1726882441.55792: checking for max_fail_percentage 12081 1726882441.55798: done checking for max_fail_percentage 12081 
1726882441.55799: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.55800: done checking to see if all hosts have failed 12081 1726882441.55801: getting the remaining hosts for this loop 12081 1726882441.55803: done getting the remaining hosts for this loop 12081 1726882441.55812: getting the next task for host managed_node3 12081 1726882441.55820: done getting next task for host managed_node3 12081 1726882441.55824: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882441.55829: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.55851: getting variables 12081 1726882441.55853: in VariableManager get_vars() 12081 1726882441.55906: Calling all_inventory to load vars for managed_node3 12081 1726882441.55909: Calling groups_inventory to load vars for managed_node3 12081 1726882441.55912: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.55921: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.55924: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.55926: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.56776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.57769: done with get_vars() 12081 1726882441.57790: done getting variables 12081 1726882441.57835: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:01 -0400 (0:00:00.100) 0:01:01.381 ****** 12081 1726882441.57869: entering _queue_task() for managed_node3/package 12081 1726882441.58138: worker is 1 (out of 1 available) 12081 1726882441.58151: exiting _queue_task() for managed_node3/package 12081 1726882441.58169: done queuing things up, now waiting for results queue to drain 12081 1726882441.58170: waiting for pending results... 
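The "Install packages" task above is skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, i.e. every package the role wants is already present in the gathered package facts. The `subset` test behaves like set containment; a minimal Python sketch of the same logic, with invented package names (the real lists come from role defaults and `package_facts`):

```python
# Sketch of the conditional behind the skip above.
# Package names are illustrative, not from the log.
network_packages = ["NetworkManager"]
installed = {  # stands in for ansible_facts.packages
    "NetworkManager": [{"version": "1.48"}],
    "openssh": [{"version": "9.6"}],
}

# "network_packages is subset(ansible_facts.packages.keys())"
# ~ every required package name is among the installed ones:
already_installed = set(network_packages) <= set(installed.keys())

# The task runs only when NOT everything is already installed:
should_run = not already_installed
print(should_run)  # False -> task skipped
```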
12081 1726882441.58361: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12081 1726882441.58465: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e13 12081 1726882441.58478: variable 'ansible_search_path' from source: unknown 12081 1726882441.58481: variable 'ansible_search_path' from source: unknown 12081 1726882441.58513: calling self._execute() 12081 1726882441.58590: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.58594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.58606: variable 'omit' from source: magic vars 12081 1726882441.58881: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.58892: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.58978: variable 'network_state' from source: role '' defaults 12081 1726882441.58986: Evaluated conditional (network_state != {}): False 12081 1726882441.58990: when evaluation is False, skipping this task 12081 1726882441.58992: _execute() done 12081 1726882441.58995: dumping result to json 12081 1726882441.58997: done dumping result, returning 12081 1726882441.59004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000e13] 12081 1726882441.59010: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e13 12081 1726882441.59109: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e13 12081 1726882441.59112: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882441.59157: no more pending results, returning what we have 12081 1726882441.59160: results queue empty 12081 1726882441.59161: checking 
for any_errors_fatal 12081 1726882441.59170: done checking for any_errors_fatal 12081 1726882441.59171: checking for max_fail_percentage 12081 1726882441.59173: done checking for max_fail_percentage 12081 1726882441.59174: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.59180: done checking to see if all hosts have failed 12081 1726882441.59181: getting the remaining hosts for this loop 12081 1726882441.59183: done getting the remaining hosts for this loop 12081 1726882441.59187: getting the next task for host managed_node3 12081 1726882441.59195: done getting next task for host managed_node3 12081 1726882441.59198: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882441.59204: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.59225: getting variables 12081 1726882441.59227: in VariableManager get_vars() 12081 1726882441.59267: Calling all_inventory to load vars for managed_node3 12081 1726882441.59270: Calling groups_inventory to load vars for managed_node3 12081 1726882441.59272: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.59281: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.59290: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.59293: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.60241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.61832: done with get_vars() 12081 1726882441.61856: done getting variables 12081 1726882441.61903: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:01 -0400 (0:00:00.040) 0:01:01.422 ****** 12081 1726882441.61930: entering _queue_task() for managed_node3/package 12081 1726882441.62178: worker is 1 (out of 1 available) 12081 1726882441.62191: exiting _queue_task() for managed_node3/package 12081 1726882441.62205: done queuing things up, now waiting for results queue to drain 12081 1726882441.62207: waiting for pending results... 
12081 1726882441.62400: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12081 1726882441.62511: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e14 12081 1726882441.62522: variable 'ansible_search_path' from source: unknown 12081 1726882441.62526: variable 'ansible_search_path' from source: unknown 12081 1726882441.62560: calling self._execute() 12081 1726882441.62636: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.62640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.62650: variable 'omit' from source: magic vars 12081 1726882441.62923: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.62933: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.63020: variable 'network_state' from source: role '' defaults 12081 1726882441.63029: Evaluated conditional (network_state != {}): False 12081 1726882441.63032: when evaluation is False, skipping this task 12081 1726882441.63034: _execute() done 12081 1726882441.63037: dumping result to json 12081 1726882441.63041: done dumping result, returning 12081 1726882441.63049: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-0a3f-ff3c-000000000e14] 12081 1726882441.63055: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e14 12081 1726882441.63149: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e14 12081 1726882441.63152: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882441.63228: no more pending results, returning what we have 12081 1726882441.63231: results queue empty 12081 1726882441.63232: checking for 
any_errors_fatal 12081 1726882441.63238: done checking for any_errors_fatal 12081 1726882441.63238: checking for max_fail_percentage 12081 1726882441.63240: done checking for max_fail_percentage 12081 1726882441.63241: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.63241: done checking to see if all hosts have failed 12081 1726882441.63242: getting the remaining hosts for this loop 12081 1726882441.63244: done getting the remaining hosts for this loop 12081 1726882441.63247: getting the next task for host managed_node3 12081 1726882441.63254: done getting next task for host managed_node3 12081 1726882441.63258: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882441.63269: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.63291: getting variables 12081 1726882441.63293: in VariableManager get_vars() 12081 1726882441.63330: Calling all_inventory to load vars for managed_node3 12081 1726882441.63332: Calling groups_inventory to load vars for managed_node3 12081 1726882441.63334: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.63342: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.63344: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.63346: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.64748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.66602: done with get_vars() 12081 1726882441.66640: done getting variables 12081 1726882441.66708: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:01 -0400 (0:00:00.048) 0:01:01.470 ****** 12081 1726882441.66750: entering _queue_task() for managed_node3/service 12081 1726882441.67109: worker is 1 (out of 1 available) 12081 1726882441.67121: exiting _queue_task() for managed_node3/service 12081 1726882441.67134: done queuing things up, now waiting for results queue to drain 12081 1726882441.67135: waiting for pending results... 
12081 1726882441.67451: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12081 1726882441.67604: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e15 12081 1726882441.67617: variable 'ansible_search_path' from source: unknown 12081 1726882441.67621: variable 'ansible_search_path' from source: unknown 12081 1726882441.67660: calling self._execute() 12081 1726882441.67768: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.67772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.67782: variable 'omit' from source: magic vars 12081 1726882441.68160: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.68175: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.68308: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.68544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882441.71454: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.71529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.71578: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.71619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.71644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.71733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12081 1726882441.71764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.71791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.71838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.71852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.71901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.71922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.71953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.71996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.72009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.72054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.72081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.72104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.72140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.72164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.72343: variable 'network_connections' from source: task vars 12081 1726882441.72354: variable 'port2_profile' from source: play vars 12081 1726882441.72432: variable 'port2_profile' from source: play vars 12081 1726882441.72443: variable 'port1_profile' from source: play vars 12081 1726882441.72512: variable 'port1_profile' from source: play vars 12081 1726882441.72520: variable 'controller_profile' from source: play vars 12081 1726882441.72587: variable 'controller_profile' from source: play vars 12081 1726882441.72662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882441.72842: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882441.72883: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882441.72930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882441.72960: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882441.73003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882441.73031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882441.73055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.73086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882441.73144: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882441.73403: variable 'network_connections' from source: task vars 12081 1726882441.73408: variable 'port2_profile' from source: play vars 12081 1726882441.73478: variable 'port2_profile' from source: play vars 12081 1726882441.73485: variable 'port1_profile' from source: play vars 12081 1726882441.73541: variable 'port1_profile' from source: play vars 12081 1726882441.73548: variable 'controller_profile' from source: play vars 12081 1726882441.73617: variable 'controller_profile' from source: play vars 12081 1726882441.73640: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12081 1726882441.73651: when evaluation is False, skipping this task 12081 1726882441.73654: _execute() done 12081 1726882441.73656: dumping result to json 12081 1726882441.73658: done dumping result, returning 12081 1726882441.73667: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000e15] 12081 1726882441.73678: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e15 12081 1726882441.73779: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e15 12081 1726882441.73783: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12081 1726882441.73829: no more pending results, returning what we have 12081 1726882441.73833: results queue empty 12081 1726882441.73833: checking for any_errors_fatal 12081 1726882441.73839: done checking for any_errors_fatal 12081 1726882441.73840: checking for max_fail_percentage 12081 1726882441.73841: done checking for max_fail_percentage 12081 1726882441.73842: checking to see if all hosts have failed and the running result is not ok 12081 1726882441.73843: done checking to see if all hosts have failed 12081 1726882441.73844: getting the remaining hosts for this loop 12081 1726882441.73846: done getting the remaining hosts for this loop 12081 1726882441.73849: getting the next task for host managed_node3 12081 1726882441.73860: done getting next task for host managed_node3 12081 1726882441.73866: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882441.73872: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882441.73894: getting variables 12081 1726882441.73896: in VariableManager get_vars() 12081 1726882441.73940: Calling all_inventory to load vars for managed_node3 12081 1726882441.73942: Calling groups_inventory to load vars for managed_node3 12081 1726882441.73945: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882441.73958: Calling all_plugins_play to load vars for managed_node3 12081 1726882441.73960: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882441.73963: Calling groups_plugins_play to load vars for managed_node3 12081 1726882441.75990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882441.77789: done with get_vars() 12081 1726882441.77825: done getting variables 12081 1726882441.77896: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:01 -0400 (0:00:00.111) 0:01:01.582 ****** 12081 1726882441.77931: entering _queue_task() for managed_node3/service 12081 1726882441.78289: worker is 1 (out of 1 available) 12081 1726882441.78304: exiting _queue_task() for managed_node3/service 12081 1726882441.78317: done queuing things up, now waiting for results queue to drain 12081 1726882441.78318: waiting for pending results... 
12081 1726882441.78626: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12081 1726882441.78784: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e16 12081 1726882441.78797: variable 'ansible_search_path' from source: unknown 12081 1726882441.78801: variable 'ansible_search_path' from source: unknown 12081 1726882441.78836: calling self._execute() 12081 1726882441.78939: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.78948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.78961: variable 'omit' from source: magic vars 12081 1726882441.79344: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.79359: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882441.79537: variable 'network_provider' from source: set_fact 12081 1726882441.79541: variable 'network_state' from source: role '' defaults 12081 1726882441.79551: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12081 1726882441.79560: variable 'omit' from source: magic vars 12081 1726882441.79640: variable 'omit' from source: magic vars 12081 1726882441.79670: variable 'network_service_name' from source: role '' defaults 12081 1726882441.79742: variable 'network_service_name' from source: role '' defaults 12081 1726882441.79856: variable '__network_provider_setup' from source: role '' defaults 12081 1726882441.79867: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882441.79931: variable '__network_service_name_default_nm' from source: role '' defaults 12081 1726882441.79939: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882441.80006: variable '__network_packages_default_nm' from source: role '' defaults 12081 1726882441.80251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12081 1726882441.82793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882441.83370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882441.83375: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882441.83377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882441.83380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882441.83383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.83386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.83388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.83390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.83393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.83395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12081 1726882441.83397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.83399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.83401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.83403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.83617: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12081 1726882441.83729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.83762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.83785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.83824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.83835: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.83928: variable 'ansible_python' from source: facts 12081 1726882441.83944: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12081 1726882441.84036: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882441.84114: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882441.84245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.84273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.84304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.84342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.84355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.84411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882441.84434: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882441.84460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.84499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882441.84520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882441.84673: variable 'network_connections' from source: task vars 12081 1726882441.84679: variable 'port2_profile' from source: play vars 12081 1726882441.84762: variable 'port2_profile' from source: play vars 12081 1726882441.84776: variable 'port1_profile' from source: play vars 12081 1726882441.84853: variable 'port1_profile' from source: play vars 12081 1726882441.84868: variable 'controller_profile' from source: play vars 12081 1726882441.84939: variable 'controller_profile' from source: play vars 12081 1726882441.85071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882441.85287: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882441.85334: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882441.85383: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882441.85425: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882441.85491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882441.85523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882441.85553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882441.85589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882441.85647: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.85956: variable 'network_connections' from source: task vars 12081 1726882441.85966: variable 'port2_profile' from source: play vars 12081 1726882441.86045: variable 'port2_profile' from source: play vars 12081 1726882441.86058: variable 'port1_profile' from source: play vars 12081 1726882441.86128: variable 'port1_profile' from source: play vars 12081 1726882441.86143: variable 'controller_profile' from source: play vars 12081 1726882441.86219: variable 'controller_profile' from source: play vars 12081 1726882441.86255: variable '__network_packages_default_wireless' from source: role '' defaults 12081 1726882441.86341: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882441.86645: variable 'network_connections' from source: task vars 12081 1726882441.86648: variable 'port2_profile' from source: play vars 12081 1726882441.86728: variable 'port2_profile' from source: play vars 12081 
1726882441.86735: variable 'port1_profile' from source: play vars 12081 1726882441.86812: variable 'port1_profile' from source: play vars 12081 1726882441.86819: variable 'controller_profile' from source: play vars 12081 1726882441.86890: variable 'controller_profile' from source: play vars 12081 1726882441.86919: variable '__network_packages_default_team' from source: role '' defaults 12081 1726882441.86999: variable '__network_team_connections_defined' from source: role '' defaults 12081 1726882441.87306: variable 'network_connections' from source: task vars 12081 1726882441.87310: variable 'port2_profile' from source: play vars 12081 1726882441.87379: variable 'port2_profile' from source: play vars 12081 1726882441.87385: variable 'port1_profile' from source: play vars 12081 1726882441.87441: variable 'port1_profile' from source: play vars 12081 1726882441.87454: variable 'controller_profile' from source: play vars 12081 1726882441.87517: variable 'controller_profile' from source: play vars 12081 1726882441.87580: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882441.87640: variable '__network_service_name_default_initscripts' from source: role '' defaults 12081 1726882441.87646: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882441.87717: variable '__network_packages_default_initscripts' from source: role '' defaults 12081 1726882441.87951: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12081 1726882441.88507: variable 'network_connections' from source: task vars 12081 1726882441.88510: variable 'port2_profile' from source: play vars 12081 1726882441.88580: variable 'port2_profile' from source: play vars 12081 1726882441.88588: variable 'port1_profile' from source: play vars 12081 1726882441.88645: variable 'port1_profile' from source: play vars 12081 1726882441.88665: variable 'controller_profile' from source: play vars 12081 
1726882441.88720: variable 'controller_profile' from source: play vars 12081 1726882441.88730: variable 'ansible_distribution' from source: facts 12081 1726882441.88733: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.88738: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.88754: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12081 1726882441.88935: variable 'ansible_distribution' from source: facts 12081 1726882441.88938: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.88946: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.88960: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12081 1726882441.89568: variable 'ansible_distribution' from source: facts 12081 1726882441.89571: variable '__network_rh_distros' from source: role '' defaults 12081 1726882441.89573: variable 'ansible_distribution_major_version' from source: facts 12081 1726882441.89576: variable 'network_provider' from source: set_fact 12081 1726882441.89577: variable 'omit' from source: magic vars 12081 1726882441.89579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882441.89582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882441.89584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882441.89586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882441.89588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882441.89590: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882441.89592: 
variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.89594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.89600: Set connection var ansible_pipelining to False 12081 1726882441.89602: Set connection var ansible_shell_type to sh 12081 1726882441.89604: Set connection var ansible_shell_executable to /bin/sh 12081 1726882441.89606: Set connection var ansible_connection to ssh 12081 1726882441.89608: Set connection var ansible_timeout to 10 12081 1726882441.89610: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882441.89612: variable 'ansible_shell_executable' from source: unknown 12081 1726882441.89614: variable 'ansible_connection' from source: unknown 12081 1726882441.89616: variable 'ansible_module_compression' from source: unknown 12081 1726882441.89618: variable 'ansible_shell_type' from source: unknown 12081 1726882441.89620: variable 'ansible_shell_executable' from source: unknown 12081 1726882441.89622: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882441.89624: variable 'ansible_pipelining' from source: unknown 12081 1726882441.89626: variable 'ansible_timeout' from source: unknown 12081 1726882441.89628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882441.89674: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882441.89683: variable 'omit' from source: magic vars 12081 1726882441.89688: starting attempt loop 12081 1726882441.89691: running the handler 12081 1726882441.89779: variable 'ansible_facts' from source: unknown 12081 1726882441.90628: _low_level_execute_command(): starting 12081 1726882441.90636: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882441.91399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882441.91411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882441.91426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.91439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882441.91488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882441.91498: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882441.91508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.91521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882441.91531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882441.91538: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882441.91547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882441.91562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.91575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882441.91583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882441.91589: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882441.91599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.91683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882441.91698: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882441.91701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882441.91837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882441.93551: stdout chunk (state=3): >>>/root <<< 12081 1726882441.93753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882441.93756: stdout chunk (state=3): >>><<< 12081 1726882441.93759: stderr chunk (state=3): >>><<< 12081 1726882441.93888: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882441.93892: _low_level_execute_command(): starting 12081 1726882441.93896: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653 `" && echo ansible-tmp-1726882441.937953-14835-75231958347653="` echo /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653 `" ) && sleep 0' 12081 1726882441.94611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882441.94626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882441.94640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.94662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882441.94708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882441.94720: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882441.94733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.94749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882441.94760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882441.94778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882441.94793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882441.94806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.94821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882441.94831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882441.94841: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882441.94852: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.94936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882441.94953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882441.94970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882441.95134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882441.96996: stdout chunk (state=3): >>>ansible-tmp-1726882441.937953-14835-75231958347653=/root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653 <<< 12081 1726882441.97110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882441.97229: stderr chunk (state=3): >>><<< 12081 1726882441.97242: stdout chunk (state=3): >>><<< 12081 1726882441.97575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882441.937953-14835-75231958347653=/root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882441.97578: variable 'ansible_module_compression' from source: unknown 12081 1726882441.97581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12081 1726882441.97583: variable 'ansible_facts' from source: unknown 12081 1726882441.97617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/AnsiballZ_systemd.py 12081 1726882441.97786: Sending initial data 12081 1726882441.97790: Sent initial data (154 bytes) 12081 1726882441.98681: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882441.98700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.98704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882441.98739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.98743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882441.98745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882441.98747: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882441.98795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882441.98804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882441.98918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.00683: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12081 1726882442.00690: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12081 1726882442.00697: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12081 1726882442.00704: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12081 1726882442.00710: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 12081 1726882442.00719: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 12081 1726882442.00722: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 12081 1726882442.00729: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 12081 1726882442.00747: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882442.00851: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882442.00957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpjoe4nwy2 /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/AnsiballZ_systemd.py <<< 12081 1726882442.01082: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882442.03348: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.03449: stderr chunk (state=3): >>><<< 12081 1726882442.03452: stdout chunk (state=3): >>><<< 12081 1726882442.03471: done transferring module to remote 12081 1726882442.03481: _low_level_execute_command(): starting 12081 1726882442.03490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/ /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/AnsiballZ_systemd.py && sleep 0' 12081 1726882442.03998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.04003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.04044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.04060: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882442.04076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.04093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882442.04103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882442.04113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882442.04123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.04139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.04156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.04171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.04186: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882442.04199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.04288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882442.04309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.04328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.04463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.06342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.06397: stderr chunk (state=3): >>><<< 12081 1726882442.06400: stdout chunk (state=3): >>><<< 12081 1726882442.06412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882442.06417: _low_level_execute_command(): starting 12081 1726882442.06419: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/AnsiballZ_systemd.py && sleep 0' 12081 1726882442.06860: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.06866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.06911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.06914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.06917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.06985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.06988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.07092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.32512: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": 
"dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", 
"ControlGroupId": "2455", "MemoryCurrent": "14041088", "MemoryAvailable": "infinity", "CPUUsageNSec": "1137208000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0<<< 12081 1726882442.32522: stdout chunk (state=3): >>>", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": 
"no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12081 1726882442.34171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882442.34182: stderr chunk (state=3): >>><<< 12081 1726882442.34185: stdout chunk (state=3): >>><<< 12081 1726882442.34473: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "14041088", "MemoryAvailable": "infinity", "CPUUsageNSec": "1137208000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", 
"MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin 
cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", 
"FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882442.34483: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882442.34486: _low_level_execute_command(): starting 12081 1726882442.34488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882441.937953-14835-75231958347653/ > /dev/null 2>&1 && sleep 0' 12081 
1726882442.35497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882442.35501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.35519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.35522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.35568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.35577: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882442.35586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.35606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882442.35609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882442.35620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882442.35626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.35635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.35646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.35653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.35665: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882442.35713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.35751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882442.35766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.35776: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.35899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.37988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.38054: stderr chunk (state=3): >>><<< 12081 1726882442.38057: stdout chunk (state=3): >>><<< 12081 1726882442.38079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882442.38083: handler run complete 12081 1726882442.38126: attempt loop complete, returning result 12081 1726882442.38130: _execute() done 12081 1726882442.38132: dumping result to json 12081 1726882442.38142: done dumping result, returning 12081 1726882442.38151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start 
NetworkManager [0e448fcc-3ce9-0a3f-ff3c-000000000e16] 12081 1726882442.38158: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e16 12081 1726882442.38522: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e16 12081 1726882442.38525: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882442.38587: no more pending results, returning what we have 12081 1726882442.38590: results queue empty 12081 1726882442.38590: checking for any_errors_fatal 12081 1726882442.38596: done checking for any_errors_fatal 12081 1726882442.38596: checking for max_fail_percentage 12081 1726882442.38598: done checking for max_fail_percentage 12081 1726882442.38599: checking to see if all hosts have failed and the running result is not ok 12081 1726882442.38600: done checking to see if all hosts have failed 12081 1726882442.38601: getting the remaining hosts for this loop 12081 1726882442.38602: done getting the remaining hosts for this loop 12081 1726882442.38605: getting the next task for host managed_node3 12081 1726882442.38612: done getting next task for host managed_node3 12081 1726882442.38615: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882442.38620: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882442.38630: getting variables 12081 1726882442.38631: in VariableManager get_vars() 12081 1726882442.38674: Calling all_inventory to load vars for managed_node3 12081 1726882442.38677: Calling groups_inventory to load vars for managed_node3 12081 1726882442.38679: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882442.38688: Calling all_plugins_play to load vars for managed_node3 12081 1726882442.38690: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882442.38692: Calling groups_plugins_play to load vars for managed_node3 12081 1726882442.39500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882442.41859: done with get_vars() 12081 1726882442.41885: done getting variables 12081 1726882442.41950: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:02 -0400 (0:00:00.640) 0:01:02.222 ****** 12081 1726882442.41994: entering _queue_task() for 
managed_node3/service 12081 1726882442.42351: worker is 1 (out of 1 available) 12081 1726882442.42371: exiting _queue_task() for managed_node3/service 12081 1726882442.42389: done queuing things up, now waiting for results queue to drain 12081 1726882442.42391: waiting for pending results... 12081 1726882442.42639: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12081 1726882442.42748: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e17 12081 1726882442.42763: variable 'ansible_search_path' from source: unknown 12081 1726882442.42769: variable 'ansible_search_path' from source: unknown 12081 1726882442.42797: calling self._execute() 12081 1726882442.42873: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.42877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882442.42885: variable 'omit' from source: magic vars 12081 1726882442.43150: variable 'ansible_distribution_major_version' from source: facts 12081 1726882442.43165: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882442.43241: variable 'network_provider' from source: set_fact 12081 1726882442.43245: Evaluated conditional (network_provider == "nm"): True 12081 1726882442.43313: variable '__network_wpa_supplicant_required' from source: role '' defaults 12081 1726882442.43379: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12081 1726882442.43500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882442.45514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882442.45572: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882442.45606: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882442.45631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882442.45651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882442.45972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882442.45976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882442.45978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882442.45981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882442.45983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882442.45985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882442.45987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882442.46019: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882442.46037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882442.46051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882442.46094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882442.46123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882442.46148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882442.46193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882442.46208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882442.46371: variable 'network_connections' from source: task vars 12081 1726882442.46383: variable 'port2_profile' from source: play vars 12081 1726882442.46454: variable 'port2_profile' from source: play vars 12081 
1726882442.46470: variable 'port1_profile' from source: play vars 12081 1726882442.46526: variable 'port1_profile' from source: play vars 12081 1726882442.46534: variable 'controller_profile' from source: play vars 12081 1726882442.46602: variable 'controller_profile' from source: play vars 12081 1726882442.46681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12081 1726882442.46945: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12081 1726882442.47009: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12081 1726882442.47038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12081 1726882442.47063: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12081 1726882442.47117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12081 1726882442.47141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12081 1726882442.47162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882442.47188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12081 1726882442.47232: variable '__network_wireless_connections_defined' from source: role '' defaults 12081 1726882442.47406: variable 'network_connections' from source: task vars 12081 1726882442.47413: variable 
'port2_profile' from source: play vars 12081 1726882442.47457: variable 'port2_profile' from source: play vars 12081 1726882442.47469: variable 'port1_profile' from source: play vars 12081 1726882442.47515: variable 'port1_profile' from source: play vars 12081 1726882442.47534: variable 'controller_profile' from source: play vars 12081 1726882442.47578: variable 'controller_profile' from source: play vars 12081 1726882442.47601: Evaluated conditional (__network_wpa_supplicant_required): False 12081 1726882442.47604: when evaluation is False, skipping this task 12081 1726882442.47607: _execute() done 12081 1726882442.47609: dumping result to json 12081 1726882442.47611: done dumping result, returning 12081 1726882442.47619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-0a3f-ff3c-000000000e17] 12081 1726882442.47624: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e17 12081 1726882442.47711: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e17 12081 1726882442.47714: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12081 1726882442.47785: no more pending results, returning what we have 12081 1726882442.47789: results queue empty 12081 1726882442.47790: checking for any_errors_fatal 12081 1726882442.47818: done checking for any_errors_fatal 12081 1726882442.47818: checking for max_fail_percentage 12081 1726882442.47820: done checking for max_fail_percentage 12081 1726882442.47821: checking to see if all hosts have failed and the running result is not ok 12081 1726882442.47822: done checking to see if all hosts have failed 12081 1726882442.47823: getting the remaining hosts for this loop 12081 1726882442.47825: done getting the remaining hosts for this loop 12081 1726882442.47828: getting the next task for host managed_node3 
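The `wpa_supplicant` skip above shows the pattern Ansible uses for `when:` lists: each conditional is evaluated in order against the host's variables, and the task is skipped at the first one that comes back False, which is then reported as `false_condition`. A rough sketch of that short-circuit (simplified stand-in, not the actual `TaskExecutor` code — the real implementation templates each expression with Jinja2 before testing it):

```python
def evaluate_conditionals(conditionals, variables):
    """Return (ok, false_condition): stop at the first False conditional.

    Simplified sketch of Ansible's when-evaluation; `variables` here maps
    each already-templated expression to its boolean result.
    """
    for cond in conditionals:
        if not variables.get(cond, False):
            return False, cond  # task will be skipped; cond is reported
    return True, None

# The three conditionals evaluated for the wpa_supplicant task in the log:
host_vars = {
    "ansible_distribution_major_version != '6'": True,
    'network_provider == "nm"': True,
    "__network_wpa_supplicant_required": False,
}
ok, false_condition = evaluate_conditionals(list(host_vars), host_vars)
print(ok, false_condition)  # False __network_wpa_supplicant_required
```

This matches the skip result in the log, where `"false_condition": "__network_wpa_supplicant_required"` names exactly the first conditional that evaluated False.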
12081 1726882442.47837: done getting next task for host managed_node3 12081 1726882442.47842: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882442.47848: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882442.47872: getting variables 12081 1726882442.47874: in VariableManager get_vars() 12081 1726882442.47915: Calling all_inventory to load vars for managed_node3 12081 1726882442.47918: Calling groups_inventory to load vars for managed_node3 12081 1726882442.47920: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882442.47930: Calling all_plugins_play to load vars for managed_node3 12081 1726882442.47932: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882442.47934: Calling groups_plugins_play to load vars for managed_node3 12081 1726882442.48777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882442.50338: done with get_vars() 12081 1726882442.50358: done getting variables 12081 1726882442.50404: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:02 -0400 (0:00:00.084) 0:01:02.307 ****** 12081 1726882442.50429: entering _queue_task() for managed_node3/service 12081 1726882442.50652: worker is 1 (out of 1 available) 12081 1726882442.50669: exiting _queue_task() for managed_node3/service 12081 1726882442.50682: done queuing things up, now waiting for results queue to drain 12081 1726882442.50684: waiting for pending results... 
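The NetworkManager result earlier in this run reported `"changed": false` because the unit's reported properties (`ActiveState: active`, `UnitFileState: enabled`) already satisfied the requested `state: started` / `enabled: true`. A minimal sketch of that idempotence check (hypothetical helper, not the systemd module's actual source), using the property names shown in the log:

```python
def unit_needs_change(props, want_state="started", want_enabled=True):
    """Return True if the unit's current state differs from the request.

    `props` is a dict of systemd unit properties as dumped in the log
    (ActiveState, UnitFileState); hypothetical helper, not ansible code.
    """
    running = props.get("ActiveState") == "active"
    enabled = props.get("UnitFileState") == "enabled"
    if want_state == "started" and not running:
        return True
    if want_state == "stopped" and running:
        return True
    return enabled != want_enabled

# Properties reported for NetworkManager.service in the log above:
nm = {"ActiveState": "active", "UnitFileState": "enabled"}
print(unit_needs_change(nm))  # False -> module reports changed=false
```

Because no change is needed, the module only queries the unit and returns the full property dump without issuing any `systemctl start` or `enable` action.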
12081 1726882442.50873: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12081 1726882442.50977: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e18 12081 1726882442.50989: variable 'ansible_search_path' from source: unknown 12081 1726882442.50992: variable 'ansible_search_path' from source: unknown 12081 1726882442.51022: calling self._execute() 12081 1726882442.51098: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.51102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882442.51110: variable 'omit' from source: magic vars 12081 1726882442.51383: variable 'ansible_distribution_major_version' from source: facts 12081 1726882442.51394: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882442.51476: variable 'network_provider' from source: set_fact 12081 1726882442.51481: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882442.51484: when evaluation is False, skipping this task 12081 1726882442.51487: _execute() done 12081 1726882442.51489: dumping result to json 12081 1726882442.51491: done dumping result, returning 12081 1726882442.51499: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-0a3f-ff3c-000000000e18] 12081 1726882442.51505: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e18 12081 1726882442.51597: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e18 12081 1726882442.51600: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12081 1726882442.51641: no more pending results, returning what we have 12081 1726882442.51644: results queue empty 12081 1726882442.51644: checking for any_errors_fatal 12081 1726882442.51655: done checking for 
any_errors_fatal 12081 1726882442.51655: checking for max_fail_percentage 12081 1726882442.51657: done checking for max_fail_percentage 12081 1726882442.51658: checking to see if all hosts have failed and the running result is not ok 12081 1726882442.51659: done checking to see if all hosts have failed 12081 1726882442.51660: getting the remaining hosts for this loop 12081 1726882442.51662: done getting the remaining hosts for this loop 12081 1726882442.51667: getting the next task for host managed_node3 12081 1726882442.51680: done getting next task for host managed_node3 12081 1726882442.51684: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882442.51690: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882442.51711: getting variables 12081 1726882442.51712: in VariableManager get_vars() 12081 1726882442.51749: Calling all_inventory to load vars for managed_node3 12081 1726882442.51751: Calling groups_inventory to load vars for managed_node3 12081 1726882442.51753: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882442.51762: Calling all_plugins_play to load vars for managed_node3 12081 1726882442.51767: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882442.51770: Calling groups_plugins_play to load vars for managed_node3 12081 1726882442.52706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882442.53645: done with get_vars() 12081 1726882442.53669: done getting variables 12081 1726882442.53713: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:02 -0400 (0:00:00.033) 0:01:02.340 ****** 12081 1726882442.53742: entering _queue_task() for managed_node3/copy 12081 1726882442.53989: worker is 1 (out of 1 available) 12081 1726882442.54002: exiting _queue_task() for managed_node3/copy 12081 1726882442.54016: done queuing things up, now waiting for results queue to drain 12081 1726882442.54017: waiting for pending results... 
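The skip result just above for the "Enable network service" task is printed as "censored" because the task (or an enclosing block) sets `no_log: true`, so even the skip reason is hidden in the result. A minimal sketch of a task shaped to produce exactly that censored skip, assuming the same `network_provider` gate seen in the conditional evaluation (illustrative YAML only — the module and its arguments are assumptions, not the role's actual source):

```yaml
# Illustrative only -- the real task lives in the fedora.linux_system_roles
# collection. no_log: true is what turns the skip result into "censored".
- name: Enable network service
  service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
  no_log: true
```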
12081 1726882442.54209: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12081 1726882442.54322: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e19 12081 1726882442.54333: variable 'ansible_search_path' from source: unknown 12081 1726882442.54336: variable 'ansible_search_path' from source: unknown 12081 1726882442.54372: calling self._execute() 12081 1726882442.54447: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.54451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882442.54461: variable 'omit' from source: magic vars 12081 1726882442.54733: variable 'ansible_distribution_major_version' from source: facts 12081 1726882442.54743: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882442.54826: variable 'network_provider' from source: set_fact 12081 1726882442.54830: Evaluated conditional (network_provider == "initscripts"): False 12081 1726882442.54834: when evaluation is False, skipping this task 12081 1726882442.54836: _execute() done 12081 1726882442.54839: dumping result to json 12081 1726882442.54842: done dumping result, returning 12081 1726882442.54850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-0a3f-ff3c-000000000e19] 12081 1726882442.54858: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e19 12081 1726882442.54952: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e19 12081 1726882442.54957: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12081 1726882442.55007: no more pending results, returning what we have 12081 1726882442.55012: results queue empty 12081 1726882442.55012: checking for 
any_errors_fatal 12081 1726882442.55020: done checking for any_errors_fatal 12081 1726882442.55021: checking for max_fail_percentage 12081 1726882442.55022: done checking for max_fail_percentage 12081 1726882442.55023: checking to see if all hosts have failed and the running result is not ok 12081 1726882442.55024: done checking to see if all hosts have failed 12081 1726882442.55025: getting the remaining hosts for this loop 12081 1726882442.55027: done getting the remaining hosts for this loop 12081 1726882442.55030: getting the next task for host managed_node3 12081 1726882442.55039: done getting next task for host managed_node3 12081 1726882442.55044: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882442.55048: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882442.55079: getting variables 12081 1726882442.55081: in VariableManager get_vars() 12081 1726882442.55119: Calling all_inventory to load vars for managed_node3 12081 1726882442.55121: Calling groups_inventory to load vars for managed_node3 12081 1726882442.55123: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882442.55132: Calling all_plugins_play to load vars for managed_node3 12081 1726882442.55134: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882442.55136: Calling groups_plugins_play to load vars for managed_node3 12081 1726882442.55945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882442.56884: done with get_vars() 12081 1726882442.56901: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:02 -0400 (0:00:00.032) 0:01:02.372 ****** 12081 1726882442.56971: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882442.57210: worker is 1 (out of 1 available) 12081 1726882442.57222: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12081 1726882442.57236: done queuing things up, now waiting for results queue to drain 12081 1726882442.57237: waiting for pending results... 
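The "Ensure initscripts network file dependency is present" task above skipped with `false_condition: network_provider == "initscripts"`: the provider resolved via `set_fact` is `nm` on this run (as the later `module_args` confirm), so the initscripts-only `copy` action never executes. A hedged sketch of such a provider-gated copy task — the destination path and content are assumptions; only the name and the `when` gate come from the log:

```yaml
# Illustrative sketch of an initscripts-only task; dest and content are
# assumptions. The real task is at roles/network/tasks/main.yml:150.
- name: Ensure initscripts network file dependency is present
  copy:
    dest: /etc/sysconfig/network
    content: "# Managed by the network role"
    mode: "0644"
  when: network_provider == "initscripts"
```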
12081 1726882442.57428: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12081 1726882442.57537: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1a 12081 1726882442.57547: variable 'ansible_search_path' from source: unknown 12081 1726882442.57551: variable 'ansible_search_path' from source: unknown 12081 1726882442.57585: calling self._execute() 12081 1726882442.57658: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.57662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882442.57673: variable 'omit' from source: magic vars 12081 1726882442.57937: variable 'ansible_distribution_major_version' from source: facts 12081 1726882442.57947: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882442.57953: variable 'omit' from source: magic vars 12081 1726882442.58006: variable 'omit' from source: magic vars 12081 1726882442.58120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12081 1726882442.59920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12081 1726882442.59968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12081 1726882442.59995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12081 1726882442.60020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12081 1726882442.60040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12081 1726882442.60100: variable 'network_provider' from source: set_fact 12081 1726882442.60199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12081 1726882442.60218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12081 1726882442.60235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12081 1726882442.60262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12081 1726882442.60277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12081 1726882442.60331: variable 'omit' from source: magic vars 12081 1726882442.60408: variable 'omit' from source: magic vars 12081 1726882442.60478: variable 'network_connections' from source: task vars 12081 1726882442.60488: variable 'port2_profile' from source: play vars 12081 1726882442.60532: variable 'port2_profile' from source: play vars 12081 1726882442.60540: variable 'port1_profile' from source: play vars 12081 1726882442.60585: variable 'port1_profile' from source: play vars 12081 1726882442.60592: variable 'controller_profile' from source: play vars 12081 1726882442.60637: variable 'controller_profile' from source: play vars 12081 1726882442.60750: variable 'omit' from source: magic vars 12081 1726882442.60758: variable '__lsr_ansible_managed' from source: task vars 12081 1726882442.60802: variable '__lsr_ansible_managed' from source: task vars 12081 1726882442.60931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12081 
1726882442.61083: Loaded config def from plugin (lookup/template) 12081 1726882442.61086: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12081 1726882442.61107: File lookup term: get_ansible_managed.j2 12081 1726882442.61109: variable 'ansible_search_path' from source: unknown 12081 1726882442.61113: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12081 1726882442.61123: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12081 1726882442.61136: variable 'ansible_search_path' from source: unknown 12081 1726882442.64637: variable 'ansible_managed' from source: unknown 12081 1726882442.64735: variable 'omit' from source: magic vars 12081 1726882442.64757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882442.64776: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882442.64789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882442.64802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882442.64810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882442.64831: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882442.64836: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.64838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882442.64903: Set connection var ansible_pipelining to False 12081 1726882442.64906: Set connection var ansible_shell_type to sh 12081 1726882442.64912: Set connection var ansible_shell_executable to /bin/sh 12081 1726882442.64915: Set connection var ansible_connection to ssh 12081 1726882442.64920: Set connection var ansible_timeout to 10 12081 1726882442.64924: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882442.64944: variable 'ansible_shell_executable' from source: unknown 12081 1726882442.64948: variable 'ansible_connection' from source: unknown 12081 1726882442.64951: variable 'ansible_module_compression' from source: unknown 12081 1726882442.64955: variable 'ansible_shell_type' from source: unknown 12081 1726882442.64960: variable 'ansible_shell_executable' from source: unknown 12081 1726882442.64962: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882442.64964: variable 'ansible_pipelining' from source: unknown 12081 1726882442.64966: variable 'ansible_timeout' from source: unknown 12081 1726882442.64983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882442.65059: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882442.65067: variable 'omit' from source: magic vars 12081 1726882442.65073: starting attempt loop 12081 1726882442.65077: running the handler 12081 1726882442.65089: _low_level_execute_command(): starting 12081 1726882442.65097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882442.65621: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.65640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.65669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.65682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.65725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882442.65737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.65747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 12081 1726882442.65865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.67504: stdout chunk (state=3): >>>/root <<< 12081 1726882442.67612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.67663: stderr chunk (state=3): >>><<< 12081 1726882442.67668: stdout chunk (state=3): >>><<< 12081 1726882442.67687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882442.67699: _low_level_execute_command(): starting 12081 1726882442.67706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862 `" && echo ansible-tmp-1726882442.6768725-14878-150272446225862="` echo 
/root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862 `" ) && sleep 0' 12081 1726882442.68157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.68167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.68193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.68199: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882442.68208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.68217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882442.68223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.68228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.68237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.68251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882442.68259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.68312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.68323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.68436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.70316: stdout chunk (state=3): 
>>>ansible-tmp-1726882442.6768725-14878-150272446225862=/root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862 <<< 12081 1726882442.70432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.70506: stderr chunk (state=3): >>><<< 12081 1726882442.70509: stdout chunk (state=3): >>><<< 12081 1726882442.70528: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882442.6768725-14878-150272446225862=/root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882442.70581: variable 'ansible_module_compression' from source: unknown 12081 1726882442.70629: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12081 1726882442.70679: 
variable 'ansible_facts' from source: unknown 12081 1726882442.70813: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/AnsiballZ_network_connections.py 12081 1726882442.70961: Sending initial data 12081 1726882442.70964: Sent initial data (168 bytes) 12081 1726882442.71930: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882442.71938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.71949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.71967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.72005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.72012: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882442.72022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.72035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882442.72042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882442.72048: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882442.72055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.72076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.72088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.72095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882442.72102: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882442.72110: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.72190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882442.72204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.72213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.72337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.74104: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882442.74193: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882442.74295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpvwdrn9d8 /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/AnsiballZ_network_connections.py <<< 12081 1726882442.74394: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882442.75908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.76015: stderr chunk (state=3): >>><<< 12081 1726882442.76018: stdout chunk (state=3): >>><<< 12081 1726882442.76036: done transferring module to remote 12081 1726882442.76049: _low_level_execute_command(): starting 12081 
1726882442.76054: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/ /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/AnsiballZ_network_connections.py && sleep 0' 12081 1726882442.76516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882442.76520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.76552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.76555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.76558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.76608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882442.76617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.76735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882442.78467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882442.78522: stderr chunk (state=3): >>><<< 12081 1726882442.78525: stdout chunk 
(state=3): >>><<< 12081 1726882442.78552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882442.78558: _low_level_execute_command(): starting 12081 1726882442.78561: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/AnsiballZ_network_connections.py && sleep 0' 12081 1726882442.79010: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.79014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882442.79058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.79062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882442.79077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882442.79113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882442.79125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882442.79242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882443.28256: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 12081 1726882443.28267: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/eeef3ef8-a51c-440e-8607-8eb479082482: error=unknown <<< 12081 1726882443.30107: stdout chunk (state=3): 
>>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 12081 1726882443.30117: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e88d1421-07ed-4ef2-9acc-b9d849acc275: error=unknown <<< 12081 1726882443.32050: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/2229dee4-9920-46b8-8e08-f14f29160e64: error=unknown <<< 12081 1726882443.32254: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": 
"down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12081 1726882443.34012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882443.34016: stderr chunk (state=3): >>><<< 12081 1726882443.34022: stdout chunk (state=3): >>><<< 12081 1726882443.34050: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/eeef3ef8-a51c-440e-8607-8eb479082482: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e88d1421-07ed-4ef2-9acc-b9d849acc275: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_y97bnajy/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/2229dee4-9920-46b8-8e08-f14f29160e64: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882443.34102: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882443.34110: _low_level_execute_command(): starting 12081 1726882443.34115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882442.6768725-14878-150272446225862/ > /dev/null 2>&1 && sleep 0' 12081 1726882443.35858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882443.35910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882443.35921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882443.35938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882443.35978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882443.35984: stderr chunk (state=3): >>>debug2: match 
not found <<< 12081 1726882443.35993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882443.36007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882443.36016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882443.36022: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882443.36029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882443.36041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882443.36051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882443.36060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882443.36067: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882443.36077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882443.36156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882443.36175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882443.36181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882443.36312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882443.38257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882443.38261: stdout chunk (state=3): >>><<< 12081 1726882443.38273: stderr chunk (state=3): >>><<< 12081 1726882443.38292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882443.38299: handler run complete 12081 1726882443.38331: attempt loop complete, returning result 12081 1726882443.38334: _execute() done 12081 1726882443.38337: dumping result to json 12081 1726882443.38342: done dumping result, returning 12081 1726882443.38352: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-0a3f-ff3c-000000000e1a] 12081 1726882443.38360: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1a 12081 1726882443.38482: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1a 12081 1726882443.38484: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": 
"absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12081 1726882443.38692: no more pending results, returning what we have 12081 1726882443.38696: results queue empty 12081 1726882443.38697: checking for any_errors_fatal 12081 1726882443.38705: done checking for any_errors_fatal 12081 1726882443.38705: checking for max_fail_percentage 12081 1726882443.38707: done checking for max_fail_percentage 12081 1726882443.38708: checking to see if all hosts have failed and the running result is not ok 12081 1726882443.38709: done checking to see if all hosts have failed 12081 1726882443.38710: getting the remaining hosts for this loop 12081 1726882443.38712: done getting the remaining hosts for this loop 12081 1726882443.38716: getting the next task for host managed_node3 12081 1726882443.38724: done getting next task for host managed_node3 12081 1726882443.38728: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882443.38733: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882443.38752: getting variables 12081 1726882443.38757: in VariableManager get_vars() 12081 1726882443.38807: Calling all_inventory to load vars for managed_node3 12081 1726882443.38809: Calling groups_inventory to load vars for managed_node3 12081 1726882443.38812: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882443.38823: Calling all_plugins_play to load vars for managed_node3 12081 1726882443.38826: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882443.38829: Calling groups_plugins_play to load vars for managed_node3 12081 1726882443.42399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882443.45607: done with get_vars() 12081 1726882443.45641: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:03 -0400 (0:00:00.887) 0:01:03.260 ****** 12081 1726882443.45741: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882443.46986: worker is 1 (out of 1 available) 12081 1726882443.46998: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12081 1726882443.47010: done queuing things up, now waiting for results queue to drain 12081 1726882443.47012: waiting for pending results... 
12081 1726882443.47873: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12081 1726882443.48562: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1b 12081 1726882443.48588: variable 'ansible_search_path' from source: unknown 12081 1726882443.48597: variable 'ansible_search_path' from source: unknown 12081 1726882443.48642: calling self._execute() 12081 1726882443.48862: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.48877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.49039: variable 'omit' from source: magic vars 12081 1726882443.49403: variable 'ansible_distribution_major_version' from source: facts 12081 1726882443.49425: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882443.49657: variable 'network_state' from source: role '' defaults 12081 1726882443.49678: Evaluated conditional (network_state != {}): False 12081 1726882443.49743: when evaluation is False, skipping this task 12081 1726882443.49750: _execute() done 12081 1726882443.49757: dumping result to json 12081 1726882443.49766: done dumping result, returning 12081 1726882443.49778: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-0a3f-ff3c-000000000e1b] 12081 1726882443.49789: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1b skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12081 1726882443.49952: no more pending results, returning what we have 12081 1726882443.49961: results queue empty 12081 1726882443.49962: checking for any_errors_fatal 12081 1726882443.49979: done checking for any_errors_fatal 12081 1726882443.49981: checking for max_fail_percentage 12081 1726882443.49983: done checking for max_fail_percentage 12081 1726882443.49984: 
checking to see if all hosts have failed and the running result is not ok 12081 1726882443.49985: done checking to see if all hosts have failed 12081 1726882443.49986: getting the remaining hosts for this loop 12081 1726882443.49988: done getting the remaining hosts for this loop 12081 1726882443.49992: getting the next task for host managed_node3 12081 1726882443.50001: done getting next task for host managed_node3 12081 1726882443.50006: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882443.50013: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882443.50040: getting variables 12081 1726882443.50043: in VariableManager get_vars() 12081 1726882443.50094: Calling all_inventory to load vars for managed_node3 12081 1726882443.50097: Calling groups_inventory to load vars for managed_node3 12081 1726882443.50100: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882443.50113: Calling all_plugins_play to load vars for managed_node3 12081 1726882443.50115: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882443.50117: Calling groups_plugins_play to load vars for managed_node3 12081 1726882443.50882: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1b 12081 1726882443.50886: WORKER PROCESS EXITING 12081 1726882443.52486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882443.56367: done with get_vars() 12081 1726882443.56399: done getting variables 12081 1726882443.56462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:03 -0400 (0:00:00.107) 0:01:03.367 ****** 12081 1726882443.56500: entering _queue_task() for managed_node3/debug 12081 1726882443.56849: worker is 1 (out of 1 available) 12081 1726882443.56867: exiting _queue_task() for managed_node3/debug 12081 1726882443.56886: done queuing things up, now waiting for results queue to drain 12081 1726882443.56887: waiting for pending results... 
12081 1726882443.57197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12081 1726882443.57350: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1c 12081 1726882443.57375: variable 'ansible_search_path' from source: unknown 12081 1726882443.57381: variable 'ansible_search_path' from source: unknown 12081 1726882443.57425: calling self._execute() 12081 1726882443.57538: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.57552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.57573: variable 'omit' from source: magic vars 12081 1726882443.58008: variable 'ansible_distribution_major_version' from source: facts 12081 1726882443.58026: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882443.58035: variable 'omit' from source: magic vars 12081 1726882443.58130: variable 'omit' from source: magic vars 12081 1726882443.58172: variable 'omit' from source: magic vars 12081 1726882443.58227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882443.58272: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882443.58304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882443.58330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882443.58346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882443.58416: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882443.58430: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.58438: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12081 1726882443.58581: Set connection var ansible_pipelining to False 12081 1726882443.58590: Set connection var ansible_shell_type to sh 12081 1726882443.58601: Set connection var ansible_shell_executable to /bin/sh 12081 1726882443.58608: Set connection var ansible_connection to ssh 12081 1726882443.58616: Set connection var ansible_timeout to 10 12081 1726882443.58625: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882443.58669: variable 'ansible_shell_executable' from source: unknown 12081 1726882443.58680: variable 'ansible_connection' from source: unknown 12081 1726882443.58689: variable 'ansible_module_compression' from source: unknown 12081 1726882443.58696: variable 'ansible_shell_type' from source: unknown 12081 1726882443.58703: variable 'ansible_shell_executable' from source: unknown 12081 1726882443.58710: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.58719: variable 'ansible_pipelining' from source: unknown 12081 1726882443.58727: variable 'ansible_timeout' from source: unknown 12081 1726882443.58738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.58896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882443.58914: variable 'omit' from source: magic vars 12081 1726882443.58923: starting attempt loop 12081 1726882443.58930: running the handler 12081 1726882443.59086: variable '__network_connections_result' from source: set_fact 12081 1726882443.59143: handler run complete 12081 1726882443.59172: attempt loop complete, returning result 12081 1726882443.59184: _execute() done 12081 1726882443.59195: dumping result to json 12081 1726882443.59215: 
done dumping result, returning 12081 1726882443.59230: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-000000000e1c] 12081 1726882443.59241: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1c ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 12081 1726882443.59427: no more pending results, returning what we have 12081 1726882443.59432: results queue empty 12081 1726882443.59433: checking for any_errors_fatal 12081 1726882443.59439: done checking for any_errors_fatal 12081 1726882443.59440: checking for max_fail_percentage 12081 1726882443.59442: done checking for max_fail_percentage 12081 1726882443.59443: checking to see if all hosts have failed and the running result is not ok 12081 1726882443.59444: done checking to see if all hosts have failed 12081 1726882443.59445: getting the remaining hosts for this loop 12081 1726882443.59447: done getting the remaining hosts for this loop 12081 1726882443.59451: getting the next task for host managed_node3 12081 1726882443.59466: done getting next task for host managed_node3 12081 1726882443.59471: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882443.59477: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882443.59491: getting variables 12081 1726882443.59494: in VariableManager get_vars() 12081 1726882443.59541: Calling all_inventory to load vars for managed_node3 12081 1726882443.59543: Calling groups_inventory to load vars for managed_node3 12081 1726882443.59545: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882443.59558: Calling all_plugins_play to load vars for managed_node3 12081 1726882443.59561: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882443.59565: Calling groups_plugins_play to load vars for managed_node3 12081 1726882443.61874: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1c 12081 1726882443.61878: WORKER PROCESS EXITING 12081 1726882443.63250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882443.66297: done with get_vars() 12081 1726882443.66335: done getting variables 12081 1726882443.66401: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:03 -0400 (0:00:00.099) 0:01:03.467 ****** 12081 1726882443.66447: entering _queue_task() for managed_node3/debug 12081 1726882443.67509: worker is 1 (out of 1 available) 12081 1726882443.67523: exiting _queue_task() for managed_node3/debug 12081 1726882443.67537: done queuing things up, now waiting for results queue to drain 12081 1726882443.67538: waiting for pending results... 12081 1726882443.68491: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12081 1726882443.68918: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1d 12081 1726882443.69073: variable 'ansible_search_path' from source: unknown 12081 1726882443.69081: variable 'ansible_search_path' from source: unknown 12081 1726882443.69124: calling self._execute() 12081 1726882443.69335: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.69347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.69367: variable 'omit' from source: magic vars 12081 1726882443.70084: variable 'ansible_distribution_major_version' from source: facts 12081 1726882443.70161: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882443.70265: variable 'omit' from source: magic vars 12081 1726882443.70343: variable 'omit' from source: magic vars 12081 1726882443.70505: variable 'omit' from source: magic vars 12081 1726882443.70555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882443.70600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882443.70711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882443.70733: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882443.70749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882443.70790: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882443.70913: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.70923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.71157: Set connection var ansible_pipelining to False 12081 1726882443.71167: Set connection var ansible_shell_type to sh 12081 1726882443.71180: Set connection var ansible_shell_executable to /bin/sh 12081 1726882443.71187: Set connection var ansible_connection to ssh 12081 1726882443.71197: Set connection var ansible_timeout to 10 12081 1726882443.71207: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882443.71241: variable 'ansible_shell_executable' from source: unknown 12081 1726882443.71249: variable 'ansible_connection' from source: unknown 12081 1726882443.71261: variable 'ansible_module_compression' from source: unknown 12081 1726882443.71347: variable 'ansible_shell_type' from source: unknown 12081 1726882443.71359: variable 'ansible_shell_executable' from source: unknown 12081 1726882443.71369: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.71377: variable 'ansible_pipelining' from source: unknown 12081 1726882443.71385: variable 'ansible_timeout' from source: unknown 12081 1726882443.71393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.71651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882443.71791: variable 'omit' from source: magic vars 12081 1726882443.71802: starting attempt loop 12081 1726882443.71809: running the handler 12081 1726882443.71868: variable '__network_connections_result' from source: set_fact 12081 1726882443.71959: variable '__network_connections_result' from source: set_fact 12081 1726882443.72359: handler run complete 12081 1726882443.72394: attempt loop complete, returning result 12081 1726882443.72402: _execute() done 12081 1726882443.72408: dumping result to json 12081 1726882443.72416: done dumping result, returning 12081 1726882443.72542: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-0a3f-ff3c-000000000e1d] 12081 1726882443.72558: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1d ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12081 1726882443.72788: no more pending results, returning what we have 12081 1726882443.72793: results queue empty 12081 1726882443.72794: checking for any_errors_fatal 12081 1726882443.72802: done checking for any_errors_fatal 12081 1726882443.72803: checking for max_fail_percentage 12081 1726882443.72805: done checking for max_fail_percentage 12081 1726882443.72806: checking to see if all 
hosts have failed and the running result is not ok 12081 1726882443.72807: done checking to see if all hosts have failed 12081 1726882443.72808: getting the remaining hosts for this loop 12081 1726882443.72810: done getting the remaining hosts for this loop 12081 1726882443.72814: getting the next task for host managed_node3 12081 1726882443.72823: done getting next task for host managed_node3 12081 1726882443.72827: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882443.72833: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882443.72847: getting variables 12081 1726882443.72849: in VariableManager get_vars() 12081 1726882443.72903: Calling all_inventory to load vars for managed_node3 12081 1726882443.72906: Calling groups_inventory to load vars for managed_node3 12081 1726882443.72908: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882443.72929: Calling all_plugins_play to load vars for managed_node3 12081 1726882443.72932: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882443.72935: Calling groups_plugins_play to load vars for managed_node3 12081 1726882443.73972: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1d 12081 1726882443.73976: WORKER PROCESS EXITING 12081 1726882443.85395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882443.87826: done with get_vars() 12081 1726882443.87860: done getting variables 12081 1726882443.87912: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:03 -0400 (0:00:00.214) 0:01:03.682 ****** 12081 1726882443.87945: entering _queue_task() for managed_node3/debug 12081 1726882443.88985: worker is 1 (out of 1 available) 12081 1726882443.88998: exiting _queue_task() for managed_node3/debug 12081 1726882443.89010: done queuing things up, now waiting for results queue to drain 12081 1726882443.89012: waiting for pending results... 
12081 1726882443.89752: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12081 1726882443.89928: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1e 12081 1726882443.89941: variable 'ansible_search_path' from source: unknown 12081 1726882443.89947: variable 'ansible_search_path' from source: unknown 12081 1726882443.89988: calling self._execute() 12081 1726882443.90092: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882443.90096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882443.90108: variable 'omit' from source: magic vars 12081 1726882443.91090: variable 'ansible_distribution_major_version' from source: facts 12081 1726882443.91109: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882443.91281: variable 'network_state' from source: role '' defaults 12081 1726882443.91285: Evaluated conditional (network_state != {}): False 12081 1726882443.91288: when evaluation is False, skipping this task 12081 1726882443.91291: _execute() done 12081 1726882443.91294: dumping result to json 12081 1726882443.91296: done dumping result, returning 12081 1726882443.91299: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-0a3f-ff3c-000000000e1e] 12081 1726882443.91302: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1e 12081 1726882443.91745: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1e 12081 1726882443.91748: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12081 1726882443.91824: no more pending results, returning what we have 12081 1726882443.91828: results queue empty 12081 1726882443.91829: checking for any_errors_fatal 12081 1726882443.91841: done checking for any_errors_fatal 12081 1726882443.91842: checking for 
max_fail_percentage 12081 1726882443.91844: done checking for max_fail_percentage 12081 1726882443.91845: checking to see if all hosts have failed and the running result is not ok 12081 1726882443.91846: done checking to see if all hosts have failed 12081 1726882443.91847: getting the remaining hosts for this loop 12081 1726882443.91849: done getting the remaining hosts for this loop 12081 1726882443.91853: getting the next task for host managed_node3 12081 1726882443.91861: done getting next task for host managed_node3 12081 1726882443.91868: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882443.91876: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882443.91901: getting variables 12081 1726882443.91903: in VariableManager get_vars() 12081 1726882443.91954: Calling all_inventory to load vars for managed_node3 12081 1726882443.91957: Calling groups_inventory to load vars for managed_node3 12081 1726882443.91960: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882443.91975: Calling all_plugins_play to load vars for managed_node3 12081 1726882443.91978: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882443.91981: Calling groups_plugins_play to load vars for managed_node3 12081 1726882443.95191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882443.98468: done with get_vars() 12081 1726882443.98501: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:03 -0400 (0:00:00.106) 0:01:03.788 ****** 12081 1726882443.98597: entering _queue_task() for managed_node3/ping 12081 1726882443.98928: worker is 1 (out of 1 available) 12081 1726882443.98940: exiting _queue_task() for managed_node3/ping 12081 1726882443.98952: done queuing things up, now waiting for results queue to drain 12081 1726882443.98953: waiting for pending results... 
12081 1726882444.00015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12081 1726882444.00381: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e1f 12081 1726882444.00397: variable 'ansible_search_path' from source: unknown 12081 1726882444.00401: variable 'ansible_search_path' from source: unknown 12081 1726882444.00519: calling self._execute() 12081 1726882444.00763: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.00771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.00785: variable 'omit' from source: magic vars 12081 1726882444.01738: variable 'ansible_distribution_major_version' from source: facts 12081 1726882444.01751: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882444.01760: variable 'omit' from source: magic vars 12081 1726882444.01961: variable 'omit' from source: magic vars 12081 1726882444.01999: variable 'omit' from source: magic vars 12081 1726882444.02041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882444.02192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882444.02212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882444.02228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.02239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.02383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882444.02386: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.02389: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12081 1726882444.02598: Set connection var ansible_pipelining to False 12081 1726882444.02602: Set connection var ansible_shell_type to sh 12081 1726882444.02609: Set connection var ansible_shell_executable to /bin/sh 12081 1726882444.02612: Set connection var ansible_connection to ssh 12081 1726882444.02617: Set connection var ansible_timeout to 10 12081 1726882444.02622: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882444.02647: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.02650: variable 'ansible_connection' from source: unknown 12081 1726882444.02653: variable 'ansible_module_compression' from source: unknown 12081 1726882444.02656: variable 'ansible_shell_type' from source: unknown 12081 1726882444.02658: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.02664: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.02668: variable 'ansible_pipelining' from source: unknown 12081 1726882444.02671: variable 'ansible_timeout' from source: unknown 12081 1726882444.02675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.03104: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12081 1726882444.03114: variable 'omit' from source: magic vars 12081 1726882444.03119: starting attempt loop 12081 1726882444.03122: running the handler 12081 1726882444.03248: _low_level_execute_command(): starting 12081 1726882444.03254: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882444.05813: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 
1726882444.05821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.05944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.05949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.06002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.06010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.06157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.06192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.06207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.06388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.08101: stdout chunk (state=3): >>>/root <<< 12081 1726882444.08270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.08274: stderr chunk (state=3): >>><<< 12081 1726882444.08289: stdout chunk (state=3): >>><<< 12081 1726882444.08308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.08322: _low_level_execute_command(): starting 12081 1726882444.08329: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223 `" && echo ansible-tmp-1726882444.0830839-14925-197473698836223="` echo /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223 `" ) && sleep 0' 12081 1726882444.09812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.09927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.09934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.09981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.09993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.10036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882444.10042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.10198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.10212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.10218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.10348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.12243: stdout chunk (state=3): >>>ansible-tmp-1726882444.0830839-14925-197473698836223=/root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223 <<< 12081 1726882444.12421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.12425: stderr chunk (state=3): >>><<< 12081 1726882444.12428: stdout chunk (state=3): >>><<< 12081 1726882444.12452: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882444.0830839-14925-197473698836223=/root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.12507: variable 'ansible_module_compression' from source: unknown 12081 1726882444.12552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12081 1726882444.12590: variable 'ansible_facts' from source: unknown 12081 1726882444.12678: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/AnsiballZ_ping.py 12081 1726882444.13632: Sending initial data 12081 1726882444.13636: Sent initial data (153 bytes) 12081 1726882444.15681: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.15686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.15769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.15776: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.15849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.15867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.15881: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.15896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.16069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.16093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.16111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.16245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.18000: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882444.18095: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882444.18197: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpqc73_tjh /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/AnsiballZ_ping.py <<< 12081 1726882444.18296: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882444.19921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.20004: stderr chunk (state=3): >>><<< 12081 1726882444.20007: stdout chunk (state=3): >>><<< 12081 1726882444.20028: done transferring module to remote 12081 1726882444.20039: _low_level_execute_command(): starting 12081 1726882444.20044: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/ /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/AnsiballZ_ping.py && sleep 0' 12081 1726882444.21701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.21708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.21720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.21733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.21822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.21828: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.21838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.21852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.21880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.21888: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 
1726882444.21899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.21908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.21997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.22009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.22016: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.22026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.22105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.22167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.22173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.22346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.24190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.24196: stdout chunk (state=3): >>><<< 12081 1726882444.24198: stderr chunk (state=3): >>><<< 12081 1726882444.24221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.24224: _low_level_execute_command(): starting 12081 1726882444.24227: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/AnsiballZ_ping.py && sleep 0' 12081 1726882444.26122: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.26129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.26139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.26153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.26200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.26207: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.26216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.26230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.26292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.26300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882444.26305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 
1726882444.26315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.26326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.26333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.26340: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.26348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.26523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.26540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.26544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.26722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.39540: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12081 1726882444.40553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882444.40560: stdout chunk (state=3): >>><<< 12081 1726882444.40565: stderr chunk (state=3): >>><<< 12081 1726882444.40678: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882444.40683: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882444.40686: _low_level_execute_command(): starting 12081 1726882444.40688: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882444.0830839-14925-197473698836223/ > /dev/null 2>&1 && sleep 0' 12081 1726882444.41874: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.41892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.41911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.41929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.41985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.41998: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.42015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.42030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.42041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 
1726882444.42051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882444.42076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.42094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.42111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.42124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.42136: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.42169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.42324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.42343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.42352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.42476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.44304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.44413: stderr chunk (state=3): >>><<< 12081 1726882444.44417: stdout chunk (state=3): >>><<< 12081 1726882444.44835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.44840: handler run complete 12081 1726882444.44843: attempt loop complete, returning result 12081 1726882444.44846: _execute() done 12081 1726882444.44848: dumping result to json 12081 1726882444.44851: done dumping result, returning 12081 1726882444.44853: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-0a3f-ff3c-000000000e1f] 12081 1726882444.44856: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1f 12081 1726882444.44930: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e1f 12081 1726882444.44933: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12081 1726882444.44997: no more pending results, returning what we have 12081 1726882444.45000: results queue empty 12081 1726882444.45001: checking for any_errors_fatal 12081 1726882444.45006: done checking for any_errors_fatal 12081 1726882444.45007: checking for max_fail_percentage 12081 1726882444.45009: done checking for max_fail_percentage 12081 1726882444.45010: checking to see if all hosts have failed and the running result is not ok 12081 1726882444.45011: done checking to see if all hosts have failed 12081 1726882444.45012: getting the remaining hosts for this loop 12081 1726882444.45013: done getting the remaining hosts for this loop 12081 1726882444.45017: getting 
the next task for host managed_node3 12081 1726882444.45028: done getting next task for host managed_node3 12081 1726882444.45029: ^ task is: TASK: meta (role_complete) 12081 1726882444.45035: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882444.45046: getting variables 12081 1726882444.45048: in VariableManager get_vars() 12081 1726882444.45099: Calling all_inventory to load vars for managed_node3 12081 1726882444.45103: Calling groups_inventory to load vars for managed_node3 12081 1726882444.45105: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882444.45116: Calling all_plugins_play to load vars for managed_node3 12081 1726882444.45119: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882444.45122: Calling groups_plugins_play to load vars for managed_node3 12081 1726882444.46182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882444.47116: done with get_vars() 12081 1726882444.47132: done getting variables 12081 1726882444.47195: done queuing things up, now waiting for results queue to drain 12081 1726882444.47197: results queue empty 12081 1726882444.47197: checking for any_errors_fatal 12081 1726882444.47199: done checking for any_errors_fatal 12081 1726882444.47200: checking for max_fail_percentage 12081 1726882444.47200: done checking for max_fail_percentage 12081 1726882444.47201: checking to see if all hosts have failed and the running result is not ok 12081 1726882444.47201: done checking to see if all hosts have failed 12081 1726882444.47202: getting the remaining hosts for this loop 12081 1726882444.47202: done getting the remaining hosts for this loop 12081 1726882444.47204: getting the next task for host managed_node3 12081 1726882444.47207: done getting next task for host managed_node3 12081 1726882444.47209: ^ task is: TASK: Delete the device '{{ controller_device }}' 12081 1726882444.47211: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882444.47217: getting variables 12081 1726882444.47218: in VariableManager get_vars() 12081 1726882444.47230: Calling all_inventory to load vars for managed_node3 12081 1726882444.47231: Calling groups_inventory to load vars for managed_node3 12081 1726882444.47232: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882444.47236: Calling all_plugins_play to load vars for managed_node3 12081 1726882444.47237: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882444.47239: Calling groups_plugins_play to load vars for managed_node3 12081 1726882444.47921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882444.48869: done with get_vars() 12081 1726882444.48884: done getting variables 12081 1726882444.48916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12081 1726882444.49007: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 21:34:04 -0400 (0:00:00.504) 0:01:04.293 ****** 12081 1726882444.49032: entering _queue_task() for managed_node3/command 12081 1726882444.49281: worker is 1 (out of 1 available) 12081 1726882444.49294: exiting _queue_task() for managed_node3/command 12081 1726882444.49308: done queuing things up, now waiting for results queue to drain 12081 1726882444.49310: waiting for pending results... 12081 1726882444.49510: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 12081 1726882444.49600: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e4f 12081 1726882444.49611: variable 'ansible_search_path' from source: unknown 12081 1726882444.49614: variable 'ansible_search_path' from source: unknown 12081 1726882444.49646: calling self._execute() 12081 1726882444.49729: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.49733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.49743: variable 'omit' from source: magic vars 12081 1726882444.50021: variable 'ansible_distribution_major_version' from source: facts 12081 1726882444.50033: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882444.50038: variable 'omit' from source: magic vars 12081 1726882444.50054: variable 'omit' from source: magic vars 12081 1726882444.50124: variable 'controller_device' from source: play vars 12081 1726882444.50137: variable 'omit' from source: magic vars 12081 1726882444.50176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882444.50203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882444.50221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 
1726882444.50234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.50243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.50273: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882444.50277: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.50280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.50350: Set connection var ansible_pipelining to False 12081 1726882444.50359: Set connection var ansible_shell_type to sh 12081 1726882444.50367: Set connection var ansible_shell_executable to /bin/sh 12081 1726882444.50369: Set connection var ansible_connection to ssh 12081 1726882444.50374: Set connection var ansible_timeout to 10 12081 1726882444.50379: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882444.50400: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.50404: variable 'ansible_connection' from source: unknown 12081 1726882444.50406: variable 'ansible_module_compression' from source: unknown 12081 1726882444.50409: variable 'ansible_shell_type' from source: unknown 12081 1726882444.50411: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.50413: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.50417: variable 'ansible_pipelining' from source: unknown 12081 1726882444.50419: variable 'ansible_timeout' from source: unknown 12081 1726882444.50421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.50520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882444.50528: variable 'omit' from source: magic vars 12081 1726882444.50536: starting attempt loop 12081 1726882444.50539: running the handler 12081 1726882444.50550: _low_level_execute_command(): starting 12081 1726882444.50558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882444.51094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.51098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.51128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.51132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.51135: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.51188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.51191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.51194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.51301: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.52898: stdout chunk (state=3): >>>/root <<< 12081 1726882444.52996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.53058: stderr chunk (state=3): >>><<< 12081 1726882444.53064: stdout chunk (state=3): >>><<< 12081 1726882444.53086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.53097: _low_level_execute_command(): starting 12081 1726882444.53104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818 `" && echo ansible-tmp-1726882444.53086-14957-279666194564818="` echo /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818 `" ) && 
sleep 0' 12081 1726882444.53551: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.53558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.53593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.53618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.53673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.53690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.53790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.55682: stdout chunk (state=3): >>>ansible-tmp-1726882444.53086-14957-279666194564818=/root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818 <<< 12081 1726882444.55791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.55844: stderr chunk (state=3): >>><<< 12081 1726882444.55847: stdout chunk (state=3): >>><<< 12081 1726882444.55869: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882444.53086-14957-279666194564818=/root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.55896: variable 'ansible_module_compression' from source: unknown 12081 1726882444.55945: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882444.55976: variable 'ansible_facts' from source: unknown 12081 1726882444.56037: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/AnsiballZ_command.py 12081 1726882444.56141: Sending initial data 12081 1726882444.56145: Sent initial data (154 bytes) 12081 1726882444.56831: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 12081 1726882444.56841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.56870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.56881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.56928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.56940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.57060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.58789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 12081 1726882444.58793: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 
1726882444.58882: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882444.58985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmpdpivxspk /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/AnsiballZ_command.py <<< 12081 1726882444.59085: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882444.60107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.60224: stderr chunk (state=3): >>><<< 12081 1726882444.60227: stdout chunk (state=3): >>><<< 12081 1726882444.60245: done transferring module to remote 12081 1726882444.60254: _low_level_execute_command(): starting 12081 1726882444.60261: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/ /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/AnsiballZ_command.py && sleep 0' 12081 1726882444.60727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.60731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.60766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882444.60780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 12081 1726882444.60791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.60837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.60849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.60962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.62691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.62746: stderr chunk (state=3): >>><<< 12081 1726882444.62749: stdout chunk (state=3): >>><<< 12081 1726882444.62767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 12081 1726882444.62770: _low_level_execute_command(): starting 12081 1726882444.62776: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/AnsiballZ_command.py && sleep 0' 12081 1726882444.63220: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.63226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.63258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882444.63274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.63284: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.63329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.63341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.63456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.77145: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", 
"del", "nm-bond"], "start": "2024-09-20 21:34:04.762822", "end": "2024-09-20 21:34:04.769820", "delta": "0:00:00.006998", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882444.78187: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. <<< 12081 1726882444.78247: stderr chunk (state=3): >>><<< 12081 1726882444.78250: stdout chunk (state=3): >>><<< 12081 1726882444.78393: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:34:04.762822", "end": "2024-09-20 21:34:04.769820", "delta": "0:00:00.006998", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. 12081 1726882444.78398: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882444.78401: _low_level_execute_command(): starting 12081 1726882444.78403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882444.53086-14957-279666194564818/ > /dev/null 2>&1 && sleep 0' 12081 1726882444.78978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.78993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.79008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.79027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882444.79073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.79086: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.79102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.79120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.79133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.79145: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882444.79157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.79175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.79197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.79211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.79222: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.79236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.79312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.79337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.79355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.79491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.81309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.81436: stderr chunk (state=3): >>><<< 12081 1726882444.81455: stdout chunk (state=3): >>><<< 12081 1726882444.81674: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.81678: handler run complete 12081 1726882444.81680: Evaluated conditional (False): False 12081 1726882444.81682: Evaluated conditional (False): False 12081 1726882444.81684: attempt loop complete, returning result 12081 1726882444.81686: _execute() done 12081 1726882444.81688: dumping result to json 12081 1726882444.81690: done dumping result, returning 12081 1726882444.81692: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0e448fcc-3ce9-0a3f-ff3c-000000000e4f] 12081 1726882444.81694: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e4f 12081 1726882444.81787: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e4f 12081 1726882444.81790: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", 
"link", "del", "nm-bond" ], "delta": "0:00:00.006998", "end": "2024-09-20 21:34:04.769820", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:34:04.762822" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 12081 1726882444.81866: no more pending results, returning what we have 12081 1726882444.81871: results queue empty 12081 1726882444.81871: checking for any_errors_fatal 12081 1726882444.81873: done checking for any_errors_fatal 12081 1726882444.81874: checking for max_fail_percentage 12081 1726882444.81876: done checking for max_fail_percentage 12081 1726882444.81877: checking to see if all hosts have failed and the running result is not ok 12081 1726882444.81878: done checking to see if all hosts have failed 12081 1726882444.81879: getting the remaining hosts for this loop 12081 1726882444.81881: done getting the remaining hosts for this loop 12081 1726882444.81885: getting the next task for host managed_node3 12081 1726882444.81897: done getting next task for host managed_node3 12081 1726882444.81901: ^ task is: TASK: Remove test interfaces 12081 1726882444.81905: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882444.81910: getting variables 12081 1726882444.81912: in VariableManager get_vars() 12081 1726882444.81966: Calling all_inventory to load vars for managed_node3 12081 1726882444.81969: Calling groups_inventory to load vars for managed_node3 12081 1726882444.81972: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882444.81985: Calling all_plugins_play to load vars for managed_node3 12081 1726882444.81988: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882444.81991: Calling groups_plugins_play to load vars for managed_node3 12081 1726882444.83862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882444.85639: done with get_vars() 12081 1726882444.85672: done getting variables 12081 1726882444.85734: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:34:04 -0400 (0:00:00.367) 0:01:04.660 ****** 12081 1726882444.85771: entering _queue_task() for managed_node3/shell 12081 1726882444.86123: worker is 1 (out of 1 available) 12081 1726882444.86134: exiting _queue_task() for managed_node3/shell 12081 1726882444.86146: done queuing things up, now waiting for results queue to drain 12081 1726882444.86147: waiting for pending results... 
12081 1726882444.86480: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 12081 1726882444.86628: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e55 12081 1726882444.86647: variable 'ansible_search_path' from source: unknown 12081 1726882444.86658: variable 'ansible_search_path' from source: unknown 12081 1726882444.86707: calling self._execute() 12081 1726882444.86817: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.86830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.86846: variable 'omit' from source: magic vars 12081 1726882444.87269: variable 'ansible_distribution_major_version' from source: facts 12081 1726882444.87290: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882444.87302: variable 'omit' from source: magic vars 12081 1726882444.87362: variable 'omit' from source: magic vars 12081 1726882444.87536: variable 'dhcp_interface1' from source: play vars 12081 1726882444.87548: variable 'dhcp_interface2' from source: play vars 12081 1726882444.87582: variable 'omit' from source: magic vars 12081 1726882444.87631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882444.87676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882444.87704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882444.87727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.87744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882444.87788: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882444.87800: variable 'ansible_host' from source: host 
vars for 'managed_node3' 12081 1726882444.87809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.87927: Set connection var ansible_pipelining to False 12081 1726882444.87935: Set connection var ansible_shell_type to sh 12081 1726882444.87946: Set connection var ansible_shell_executable to /bin/sh 12081 1726882444.87952: Set connection var ansible_connection to ssh 12081 1726882444.87967: Set connection var ansible_timeout to 10 12081 1726882444.87976: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882444.88004: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.88016: variable 'ansible_connection' from source: unknown 12081 1726882444.88022: variable 'ansible_module_compression' from source: unknown 12081 1726882444.88028: variable 'ansible_shell_type' from source: unknown 12081 1726882444.88033: variable 'ansible_shell_executable' from source: unknown 12081 1726882444.88039: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882444.88046: variable 'ansible_pipelining' from source: unknown 12081 1726882444.88051: variable 'ansible_timeout' from source: unknown 12081 1726882444.88062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882444.88208: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882444.88226: variable 'omit' from source: magic vars 12081 1726882444.88238: starting attempt loop 12081 1726882444.88244: running the handler 12081 1726882444.88265: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882444.88292: _low_level_execute_command(): starting 12081 1726882444.88304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882444.89070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.89085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.89104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.89123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.89174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.89188: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.89202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.89223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.89233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.89243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882444.89256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.89273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.89288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.89299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.89309: stderr chunk (state=3): >>>debug2: match found <<< 12081 
1726882444.89325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.89406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.89431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.89449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.89590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.91193: stdout chunk (state=3): >>>/root <<< 12081 1726882444.91393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.91397: stdout chunk (state=3): >>><<< 12081 1726882444.91399: stderr chunk (state=3): >>><<< 12081 1726882444.91471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12081 1726882444.91482: _low_level_execute_command(): starting 12081 1726882444.91486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468 `" && echo ansible-tmp-1726882444.914263-14967-14033964508468="` echo /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468 `" ) && sleep 0' 12081 1726882444.92142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.92157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.92175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.92198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.92243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.92256: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.92279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.92298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.92315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.92326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882444.92338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.92352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.92370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.92384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.92395: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.92407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.92489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.92512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.92532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.92667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.94538: stdout chunk (state=3): >>>ansible-tmp-1726882444.914263-14967-14033964508468=/root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468 <<< 12081 1726882444.94640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882444.94729: stderr chunk (state=3): >>><<< 12081 1726882444.94740: stdout chunk (state=3): >>><<< 12081 1726882444.94772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882444.914263-14967-14033964508468=/root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882444.94870: variable 'ansible_module_compression' from source: unknown 12081 1726882444.94988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882444.94991: variable 'ansible_facts' from source: unknown 12081 1726882444.95019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/AnsiballZ_command.py 12081 1726882444.95190: Sending initial data 12081 1726882444.95193: Sent initial data (154 bytes) 12081 1726882444.96234: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882444.96250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.96271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.96297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.96341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.96354: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882444.96372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.96391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882444.96409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882444.96421: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 12081 1726882444.96434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882444.96448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882444.96467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882444.96480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882444.96492: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882444.96505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882444.96587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882444.96610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882444.96632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882444.96771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882444.98515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882444.98605: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882444.98703: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-12081i6b718uh/tmp3mums66z /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/AnsiballZ_command.py <<< 12081 1726882444.98798: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882445.00105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.00386: stderr chunk (state=3): >>><<< 12081 1726882445.00389: stdout chunk (state=3): >>><<< 12081 1726882445.00392: done transferring module to remote 12081 1726882445.00394: _low_level_execute_command(): starting 12081 1726882445.00397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/ /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/AnsiballZ_command.py && sleep 0' 12081 1726882445.01019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.01032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.01046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.01076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.01120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.01132: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.01145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.01172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.01185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.01196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.01207: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.01220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.01234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.01245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.01256: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.01272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.01354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.01379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.01403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.01534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.03295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.03391: stderr chunk (state=3): >>><<< 12081 1726882445.03400: stdout chunk (state=3): >>><<< 12081 1726882445.03512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.03515: _low_level_execute_command(): starting 12081 1726882445.03518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/AnsiballZ_command.py && sleep 0' 12081 1726882445.04112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.04126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.04140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.04167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.04210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.04223: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.04237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.04255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.04275: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.04287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.04300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.04313: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.04330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.04342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.04353: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.04390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.04467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.04495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.04513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.04652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.24695: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:05.176952", "end": "2024-09-20 21:34:05.245401", "delta": "0:00:00.068449", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; 
then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882445.25899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882445.25955: stderr chunk (state=3): >>><<< 12081 1726882445.25958: stdout chunk (state=3): >>><<< 12081 1726882445.25980: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:34:05.176952", "end": "2024-09-20 21:34:05.245401", "delta": "0:00:00.068449", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882445.26016: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882445.26023: _low_level_execute_command(): starting 12081 1726882445.26030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882444.914263-14967-14033964508468/ > /dev/null 2>&1 && sleep 0' 12081 1726882445.26487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.26493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.26538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.26542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.26544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.26596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.26603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.26611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.26728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.28525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.28576: stderr chunk (state=3): >>><<< 12081 1726882445.28579: stdout chunk (state=3): >>><<< 12081 1726882445.28593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.28599: handler run complete 12081 1726882445.28619: Evaluated conditional (False): False 12081 1726882445.28627: attempt loop complete, returning result 12081 1726882445.28629: _execute() done 12081 1726882445.28632: dumping result to json 12081 1726882445.28636: done dumping result, returning 12081 1726882445.28644: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0e448fcc-3ce9-0a3f-ff3c-000000000e55] 12081 1726882445.28650: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e55 12081 1726882445.28750: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e55 12081 1726882445.28753: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.068449", "end": "2024-09-20 21:34:05.245401", "rc": 0, "start": "2024-09-20 21:34:05.176952" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12081 1726882445.28821: no more pending results, returning what we have 12081 
1726882445.28824: results queue empty 12081 1726882445.28825: checking for any_errors_fatal 12081 1726882445.28835: done checking for any_errors_fatal 12081 1726882445.28835: checking for max_fail_percentage 12081 1726882445.28837: done checking for max_fail_percentage 12081 1726882445.28838: checking to see if all hosts have failed and the running result is not ok 12081 1726882445.28839: done checking to see if all hosts have failed 12081 1726882445.28840: getting the remaining hosts for this loop 12081 1726882445.28841: done getting the remaining hosts for this loop 12081 1726882445.28845: getting the next task for host managed_node3 12081 1726882445.28852: done getting next task for host managed_node3 12081 1726882445.28855: ^ task is: TASK: Stop dnsmasq/radvd services 12081 1726882445.28858: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882445.28865: getting variables 12081 1726882445.28867: in VariableManager get_vars() 12081 1726882445.28911: Calling all_inventory to load vars for managed_node3 12081 1726882445.28914: Calling groups_inventory to load vars for managed_node3 12081 1726882445.28916: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882445.28927: Calling all_plugins_play to load vars for managed_node3 12081 1726882445.28929: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882445.28931: Calling groups_plugins_play to load vars for managed_node3 12081 1726882445.29854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882445.30799: done with get_vars() 12081 1726882445.30817: done getting variables 12081 1726882445.30867: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:34:05 -0400 (0:00:00.451) 0:01:05.111 ****** 12081 1726882445.30891: entering _queue_task() for managed_node3/shell 12081 1726882445.31207: worker is 1 (out of 1 available) 12081 1726882445.31219: exiting _queue_task() for managed_node3/shell 12081 1726882445.31231: done queuing things up, now waiting for results queue to drain 12081 1726882445.31232: waiting for pending results... 
12081 1726882445.31432: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 12081 1726882445.31584: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e56 12081 1726882445.31606: variable 'ansible_search_path' from source: unknown 12081 1726882445.31620: variable 'ansible_search_path' from source: unknown 12081 1726882445.31668: calling self._execute() 12081 1726882445.31790: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.31803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882445.31818: variable 'omit' from source: magic vars 12081 1726882445.32232: variable 'ansible_distribution_major_version' from source: facts 12081 1726882445.32252: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882445.32274: variable 'omit' from source: magic vars 12081 1726882445.32329: variable 'omit' from source: magic vars 12081 1726882445.32376: variable 'omit' from source: magic vars 12081 1726882445.32430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882445.32474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882445.32508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882445.32530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882445.32547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882445.32587: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882445.32598: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.32612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 
1726882445.32736: Set connection var ansible_pipelining to False 12081 1726882445.32744: Set connection var ansible_shell_type to sh 12081 1726882445.32760: Set connection var ansible_shell_executable to /bin/sh 12081 1726882445.32772: Set connection var ansible_connection to ssh 12081 1726882445.32784: Set connection var ansible_timeout to 10 12081 1726882445.32794: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882445.32831: variable 'ansible_shell_executable' from source: unknown 12081 1726882445.32839: variable 'ansible_connection' from source: unknown 12081 1726882445.32847: variable 'ansible_module_compression' from source: unknown 12081 1726882445.32856: variable 'ansible_shell_type' from source: unknown 12081 1726882445.32866: variable 'ansible_shell_executable' from source: unknown 12081 1726882445.32875: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.32886: variable 'ansible_pipelining' from source: unknown 12081 1726882445.32894: variable 'ansible_timeout' from source: unknown 12081 1726882445.32901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882445.33078: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882445.33100: variable 'omit' from source: magic vars 12081 1726882445.33111: starting attempt loop 12081 1726882445.33118: running the handler 12081 1726882445.33134: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882445.33173: 
_low_level_execute_command(): starting 12081 1726882445.33185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882445.33981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.33985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.34012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.34016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.34020: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.34073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.34085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.34206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.35800: stdout chunk (state=3): >>>/root <<< 12081 1726882445.35894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.35951: stderr chunk (state=3): >>><<< 12081 1726882445.35958: stdout chunk (state=3): >>><<< 12081 1726882445.35985: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.35995: _low_level_execute_command(): starting 12081 1726882445.36002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933 `" && echo ansible-tmp-1726882445.3598225-14984-151240871183933="` echo /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933 `" ) && sleep 0' 12081 1726882445.36468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.36471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.36482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882445.36537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882445.36540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.36542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.36546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.36597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.36611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.36618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.36713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.38581: stdout chunk (state=3): >>>ansible-tmp-1726882445.3598225-14984-151240871183933=/root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933 <<< 12081 1726882445.38693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.38751: stderr chunk (state=3): >>><<< 12081 1726882445.38757: stdout chunk (state=3): >>><<< 12081 1726882445.38780: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882445.3598225-14984-151240871183933=/root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.38806: variable 'ansible_module_compression' from source: unknown 12081 1726882445.38848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882445.38883: variable 'ansible_facts' from source: unknown 12081 1726882445.38941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/AnsiballZ_command.py 12081 1726882445.39051: Sending initial data 12081 1726882445.39056: Sent initial data (156 bytes) 12081 1726882445.39749: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.39758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.39786: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.39799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.39849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.39871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.39980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.41704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882445.41799: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882445.41902: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp_39yfihj 
/root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/AnsiballZ_command.py <<< 12081 1726882445.42000: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882445.43035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.43150: stderr chunk (state=3): >>><<< 12081 1726882445.43153: stdout chunk (state=3): >>><<< 12081 1726882445.43176: done transferring module to remote 12081 1726882445.43187: _low_level_execute_command(): starting 12081 1726882445.43190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/ /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/AnsiballZ_command.py && sleep 0' 12081 1726882445.43657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.43667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.43699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.43712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.43722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 
1726882445.43775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.43787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.43905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.45693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.45756: stderr chunk (state=3): >>><<< 12081 1726882445.45759: stdout chunk (state=3): >>><<< 12081 1726882445.45851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.45854: _low_level_execute_command(): starting 12081 1726882445.45857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/AnsiballZ_command.py && sleep 0' 12081 1726882445.46367: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.46382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.46396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.46415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.46453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.46470: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.46484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.46500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.46512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.46523: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.46535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.46549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.46566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.46578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.46588: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.46600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.46697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 
1726882445.46713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.46818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.61791: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:05.597221", "end": "2024-09-20 21:34:05.616304", "delta": "0:00:00.019083", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882445.62989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.9.105 closed. <<< 12081 1726882445.63037: stderr chunk (state=3): >>><<< 12081 1726882445.63040: stdout chunk (state=3): >>><<< 12081 1726882445.63191: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:34:05.597221", "end": "2024-09-20 21:34:05.616304", "delta": "0:00:00.019083", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882445.63202: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882445.63204: _low_level_execute_command(): starting 12081 1726882445.63207: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882445.3598225-14984-151240871183933/ > /dev/null 2>&1 && sleep 0' 12081 1726882445.63810: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.63825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.63844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.63873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 
1726882445.63916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.63928: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.63942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.63970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.63986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.63999: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.64013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.64027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.64043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.64058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.64082: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.64101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.64184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.64214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.64234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.64374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.66193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.66470: stderr chunk (state=3): >>><<< 12081 1726882445.66474: stdout chunk (state=3): >>><<< 12081 1726882445.66477: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.66480: handler run complete 12081 1726882445.66482: Evaluated conditional (False): False 12081 1726882445.66484: attempt loop complete, returning result 12081 1726882445.66486: _execute() done 12081 1726882445.66488: dumping result to json 12081 1726882445.66490: done dumping result, returning 12081 1726882445.66492: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-0a3f-ff3c-000000000e56] 12081 1726882445.66494: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e56 12081 1726882445.66578: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e56 12081 1726882445.66581: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.019083", "end": "2024-09-20 21:34:05.616304", "rc": 0, "start": "2024-09-20 21:34:05.597221" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12081 1726882445.66652: no more pending results, returning what we have 12081 1726882445.66659: results queue empty 12081 1726882445.66660: checking for any_errors_fatal 12081 1726882445.66675: done checking for any_errors_fatal 12081 1726882445.66676: checking for max_fail_percentage 12081 1726882445.66677: done checking for max_fail_percentage 12081 1726882445.66679: checking to see if all hosts have failed and the running result is not ok 12081 1726882445.66680: done checking to see if all hosts have failed 12081 1726882445.66681: getting the remaining hosts for this loop 12081 1726882445.66682: done getting the remaining hosts for this loop 12081 1726882445.66686: getting the next task for host managed_node3 12081 1726882445.66698: done getting next task for host managed_node3 12081 1726882445.66702: ^ task is: TASK: Check routes and DNS 12081 1726882445.66706: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882445.66714: getting variables 12081 1726882445.66716: in VariableManager get_vars() 12081 1726882445.66769: Calling all_inventory to load vars for managed_node3 12081 1726882445.66772: Calling groups_inventory to load vars for managed_node3 12081 1726882445.66774: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882445.66787: Calling all_plugins_play to load vars for managed_node3 12081 1726882445.66789: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882445.66792: Calling groups_plugins_play to load vars for managed_node3 12081 1726882445.68466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882445.70220: done with get_vars() 12081 1726882445.70253: done getting variables 12081 1726882445.70326: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:34:05 -0400 (0:00:00.394) 0:01:05.506 ****** 12081 1726882445.70366: entering _queue_task() for managed_node3/shell 12081 
1726882445.70729: worker is 1 (out of 1 available) 12081 1726882445.70742: exiting _queue_task() for managed_node3/shell 12081 1726882445.70758: done queuing things up, now waiting for results queue to drain 12081 1726882445.70760: waiting for pending results... 12081 1726882445.71086: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 12081 1726882445.71223: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e5a 12081 1726882445.71245: variable 'ansible_search_path' from source: unknown 12081 1726882445.71256: variable 'ansible_search_path' from source: unknown 12081 1726882445.71303: calling self._execute() 12081 1726882445.71418: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.71431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882445.71444: variable 'omit' from source: magic vars 12081 1726882445.71842: variable 'ansible_distribution_major_version' from source: facts 12081 1726882445.71871: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882445.71883: variable 'omit' from source: magic vars 12081 1726882445.71940: variable 'omit' from source: magic vars 12081 1726882445.71987: variable 'omit' from source: magic vars 12081 1726882445.72033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882445.72081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882445.72107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882445.72129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882445.72146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882445.72189: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882445.72198: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.72208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882445.72321: Set connection var ansible_pipelining to False 12081 1726882445.72329: Set connection var ansible_shell_type to sh 12081 1726882445.72341: Set connection var ansible_shell_executable to /bin/sh 12081 1726882445.72348: Set connection var ansible_connection to ssh 12081 1726882445.72361: Set connection var ansible_timeout to 10 12081 1726882445.72375: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882445.72406: variable 'ansible_shell_executable' from source: unknown 12081 1726882445.72414: variable 'ansible_connection' from source: unknown 12081 1726882445.72422: variable 'ansible_module_compression' from source: unknown 12081 1726882445.72428: variable 'ansible_shell_type' from source: unknown 12081 1726882445.72435: variable 'ansible_shell_executable' from source: unknown 12081 1726882445.72441: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882445.72448: variable 'ansible_pipelining' from source: unknown 12081 1726882445.72457: variable 'ansible_timeout' from source: unknown 12081 1726882445.72459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882445.72664: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882445.72682: variable 'omit' from source: magic vars 12081 1726882445.72692: starting attempt loop 12081 1726882445.72698: running the handler 12081 1726882445.72712: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882445.72741: _low_level_execute_command(): starting 12081 1726882445.72757: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882445.73605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.73609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.73641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.73645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.73647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.73702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.73706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.73815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.75454: stdout chunk 
(state=3): >>>/root <<< 12081 1726882445.75659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.75666: stdout chunk (state=3): >>><<< 12081 1726882445.75669: stderr chunk (state=3): >>><<< 12081 1726882445.75779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.75783: _low_level_execute_command(): starting 12081 1726882445.75786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543 `" && echo ansible-tmp-1726882445.7569401-15005-137051609697543="` echo /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543 `" ) && sleep 0' 12081 1726882445.76332: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 
1726882445.76346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.76367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.76394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.76432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.76446: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.76466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.76485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.76498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.76510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.76528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.76546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.76549: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.76604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.76622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.76640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.76762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.78653: stdout chunk (state=3): 
>>>ansible-tmp-1726882445.7569401-15005-137051609697543=/root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543 <<< 12081 1726882445.78845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.78850: stdout chunk (state=3): >>><<< 12081 1726882445.78853: stderr chunk (state=3): >>><<< 12081 1726882445.78976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882445.7569401-15005-137051609697543=/root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.78980: variable 'ansible_module_compression' from source: unknown 12081 1726882445.79084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882445.79087: variable 'ansible_facts' from source: unknown 12081 
1726882445.79122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/AnsiballZ_command.py 12081 1726882445.79282: Sending initial data 12081 1726882445.79285: Sent initial data (156 bytes) 12081 1726882445.80430: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.80452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.80470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.80489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.80530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.80541: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.80562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.80585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.80598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.80611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.80622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.80636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.80651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.80673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.80687: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.80700: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.80778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.80821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.80838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.80974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.82705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882445.82798: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882445.82896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12081i6b718uh/tmp39114vik /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/AnsiballZ_command.py <<< 12081 1726882445.82991: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882445.84492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.84578: stderr chunk (state=3): >>><<< 12081 1726882445.84581: stdout chunk (state=3): >>><<< 12081 1726882445.84605: done transferring module to remote 12081 1726882445.84615: _low_level_execute_command(): starting 12081 1726882445.84621: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/ /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/AnsiballZ_command.py && sleep 0' 12081 1726882445.85327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.85979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.85987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.86002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.86039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.86047: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.86060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.86074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.86082: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.86089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.86097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.86105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.86116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.86123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.86130: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.86139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.86240: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882445.86280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.86285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.86386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882445.88217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882445.88221: stdout chunk (state=3): >>><<< 12081 1726882445.88227: stderr chunk (state=3): >>><<< 12081 1726882445.88243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882445.88246: _low_level_execute_command(): starting 12081 1726882445.88252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/AnsiballZ_command.py && sleep 0' 12081 1726882445.89761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882445.89792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.89809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.89828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.89873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.89886: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882445.89900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.89918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882445.89929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882445.89940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882445.89952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882445.89968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882445.89985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882445.89998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882445.90009: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882445.90022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882445.90098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 
1726882445.90122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882445.90139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882445.90295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.04152: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3237sec preferred_lft 3237sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:06.031407", "end": "2024-09-20 21:34:06.039952", "delta": "0:00:00.008545", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat 
/etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882446.05348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 12081 1726882446.05368: stderr chunk (state=3): >>><<< 12081 1726882446.05371: stdout chunk (state=3): >>><<< 12081 1726882446.05404: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3237sec preferred_lft 3237sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:06.031407", "end": 
"2024-09-20 21:34:06.039952", "delta": "0:00:00.008545", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
12081 1726882446.05446: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882446.05451: _low_level_execute_command(): starting 12081 1726882446.05459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882445.7569401-15005-137051609697543/ > /dev/null 2>&1 && sleep 0' 12081 1726882446.06040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.06132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.07931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.08023: stderr chunk (state=3): >>><<< 12081 1726882446.08034: stdout chunk (state=3): >>><<< 12081 1726882446.08274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882446.08277: handler run complete 12081 1726882446.08280: Evaluated conditional 
(False): False 12081 1726882446.08282: attempt loop complete, returning result 12081 1726882446.08284: _execute() done 12081 1726882446.08286: dumping result to json 12081 1726882446.08288: done dumping result, returning 12081 1726882446.08290: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0e448fcc-3ce9-0a3f-ff3c-000000000e5a] 12081 1726882446.08292: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e5a 12081 1726882446.08371: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e5a 12081 1726882446.08375: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008545", "end": "2024-09-20 21:34:06.039952", "rc": 0, "start": "2024-09-20 21:34:06.031407" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3237sec preferred_lft 3237sec inet6 fe80::1017:b6ff:fe65:79c3/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 12081 
1726882446.08459: no more pending results, returning what we have 12081 1726882446.08464: results queue empty 12081 1726882446.08465: checking for any_errors_fatal 12081 1726882446.08478: done checking for any_errors_fatal 12081 1726882446.08479: checking for max_fail_percentage 12081 1726882446.08480: done checking for max_fail_percentage 12081 1726882446.08481: checking to see if all hosts have failed and the running result is not ok 12081 1726882446.08483: done checking to see if all hosts have failed 12081 1726882446.08485: getting the remaining hosts for this loop 12081 1726882446.08486: done getting the remaining hosts for this loop 12081 1726882446.08491: getting the next task for host managed_node3 12081 1726882446.08499: done getting next task for host managed_node3 12081 1726882446.08502: ^ task is: TASK: Verify DNS and network connectivity 12081 1726882446.08506: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882446.08511: getting variables 12081 1726882446.08513: in VariableManager get_vars() 12081 1726882446.08569: Calling all_inventory to load vars for managed_node3 12081 1726882446.08573: Calling groups_inventory to load vars for managed_node3 12081 1726882446.08576: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882446.08589: Calling all_plugins_play to load vars for managed_node3 12081 1726882446.08592: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882446.08595: Calling groups_plugins_play to load vars for managed_node3 12081 1726882446.10530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882446.12371: done with get_vars() 12081 1726882446.12400: done getting variables 12081 1726882446.12468: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:34:06 -0400 (0:00:00.421) 0:01:05.927 ****** 12081 1726882446.12501: entering _queue_task() for managed_node3/shell 12081 1726882446.12865: worker is 1 (out of 1 available) 12081 1726882446.12877: exiting _queue_task() for managed_node3/shell 12081 1726882446.12894: done queuing things up, now waiting for results queue to drain 12081 1726882446.12895: waiting for pending results... 
12081 1726882446.13205: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 12081 1726882446.13351: in run() - task 0e448fcc-3ce9-0a3f-ff3c-000000000e5b 12081 1726882446.13374: variable 'ansible_search_path' from source: unknown 12081 1726882446.13382: variable 'ansible_search_path' from source: unknown 12081 1726882446.13421: calling self._execute() 12081 1726882446.13532: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882446.13544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882446.13567: variable 'omit' from source: magic vars 12081 1726882446.13960: variable 'ansible_distribution_major_version' from source: facts 12081 1726882446.13981: Evaluated conditional (ansible_distribution_major_version != '6'): True 12081 1726882446.14141: variable 'ansible_facts' from source: unknown 12081 1726882446.15023: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 12081 1726882446.15038: variable 'omit' from source: magic vars 12081 1726882446.15113: variable 'omit' from source: magic vars 12081 1726882446.15148: variable 'omit' from source: magic vars 12081 1726882446.15207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12081 1726882446.15243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12081 1726882446.15271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12081 1726882446.15300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882446.15316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12081 1726882446.15347: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12081 1726882446.15360: variable 
'ansible_host' from source: host vars for 'managed_node3' 12081 1726882446.15371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882446.15486: Set connection var ansible_pipelining to False 12081 1726882446.15493: Set connection var ansible_shell_type to sh 12081 1726882446.15511: Set connection var ansible_shell_executable to /bin/sh 12081 1726882446.15519: Set connection var ansible_connection to ssh 12081 1726882446.15528: Set connection var ansible_timeout to 10 12081 1726882446.15536: Set connection var ansible_module_compression to ZIP_DEFLATED 12081 1726882446.15571: variable 'ansible_shell_executable' from source: unknown 12081 1726882446.15579: variable 'ansible_connection' from source: unknown 12081 1726882446.15585: variable 'ansible_module_compression' from source: unknown 12081 1726882446.15591: variable 'ansible_shell_type' from source: unknown 12081 1726882446.15597: variable 'ansible_shell_executable' from source: unknown 12081 1726882446.15603: variable 'ansible_host' from source: host vars for 'managed_node3' 12081 1726882446.15614: variable 'ansible_pipelining' from source: unknown 12081 1726882446.15621: variable 'ansible_timeout' from source: unknown 12081 1726882446.15631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12081 1726882446.15787: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882446.15801: variable 'omit' from source: magic vars 12081 1726882446.15810: starting attempt loop 12081 1726882446.15816: running the handler 12081 1726882446.15835: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12081 1726882446.15868: _low_level_execute_command(): starting 12081 1726882446.15880: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12081 1726882446.16688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882446.16707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.16725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.16743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.16792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.16803: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882446.16819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.16841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882446.16852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882446.16866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882446.16878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.16891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.16904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.16915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.16930: stderr chunk (state=3): >>>debug2: match found <<< 12081 
1726882446.16946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.17027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882446.17061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.17083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.17215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.18823: stdout chunk (state=3): >>>/root <<< 12081 1726882446.18922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.19019: stderr chunk (state=3): >>><<< 12081 1726882446.19031: stdout chunk (state=3): >>><<< 12081 1726882446.19175: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12081 1726882446.19186: _low_level_execute_command(): starting 12081 1726882446.19189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308 `" && echo ansible-tmp-1726882446.1906943-15028-112894333459308="` echo /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308 `" ) && sleep 0' 12081 1726882446.19814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882446.19836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.19852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.19858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.19885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.19892: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882446.19902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.19915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882446.19923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882446.19933: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882446.19935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.19945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.19959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.19965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.19973: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882446.19983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.20056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882446.20074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.20085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.20220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.22087: stdout chunk (state=3): >>>ansible-tmp-1726882446.1906943-15028-112894333459308=/root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308 <<< 12081 1726882446.22284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.22288: stdout chunk (state=3): >>><<< 12081 1726882446.22295: stderr chunk (state=3): >>><<< 12081 1726882446.22318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882446.1906943-15028-112894333459308=/root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882446.22353: variable 'ansible_module_compression' from source: unknown 12081 1726882446.22414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12081i6b718uh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12081 1726882446.22449: variable 'ansible_facts' from source: unknown 12081 1726882446.22533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/AnsiballZ_command.py 12081 1726882446.22688: Sending initial data 12081 1726882446.22691: Sent initial data (156 bytes) 12081 1726882446.23729: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882446.23741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.23752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.23770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.23811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.23817: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882446.23827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.23842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882446.23853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882446.23865: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 12081 1726882446.23875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.23883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.23894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.23902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.23908: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882446.23918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.23995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882446.24014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.24026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.24154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.25892: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12081 1726882446.25990: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12081 1726882446.26089: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-12081i6b718uh/tmp1z34ncp0 /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/AnsiballZ_command.py <<< 12081 1726882446.26186: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12081 1726882446.27375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.27469: stderr chunk (state=3): >>><<< 12081 1726882446.27472: stdout chunk (state=3): >>><<< 12081 1726882446.27495: done transferring module to remote 12081 1726882446.27505: _low_level_execute_command(): starting 12081 1726882446.27510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/ /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/AnsiballZ_command.py && sleep 0' 12081 1726882446.28106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.28110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.28139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 12081 1726882446.28143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.28147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.28203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882446.28208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.28222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.28337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.30093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.30183: stderr chunk (state=3): >>><<< 12081 1726882446.30187: stdout chunk (state=3): >>><<< 12081 1726882446.30209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882446.30213: _low_level_execute_command(): starting 12081 1726882446.30216: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/AnsiballZ_command.py && sleep 0' 12081 1726882446.30877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882446.30883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.30895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.30910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.30956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.30959: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882446.30973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.30988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882446.30993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882446.31000: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882446.31008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.31017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.31028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.31041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.31048: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882446.31059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.31131: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12081 1726882446.31152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.31166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.31299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.56404: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5754 0 --:--:-- --:--:-- --:--:-- 5865\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 
--:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6767", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:06.443461", "end": "2024-09-20 21:34:06.562506", "delta": "0:00:00.119045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12081 1726882446.57732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 12081 1726882446.57736: stdout chunk (state=3): >>><<< 12081 1726882446.57739: stderr chunk (state=3): >>><<< 12081 1726882446.57903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5754 0 --:--:-- --:--:-- --:--:-- 5865\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6767", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:06.443461", "end": "2024-09-20 21:34:06.562506", "delta": "0:00:00.119045", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 12081 1726882446.57907: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12081 1726882446.57910: _low_level_execute_command(): starting 12081 1726882446.57912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882446.1906943-15028-112894333459308/ > /dev/null 2>&1 && sleep 0' 12081 1726882446.58522: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12081 1726882446.58537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.58551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.58574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.58617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 12081 1726882446.58631: stderr chunk (state=3): >>>debug2: match not found <<< 12081 1726882446.58647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.58671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12081 1726882446.58687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 12081 1726882446.58697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12081 1726882446.58709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12081 1726882446.58721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12081 1726882446.58735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12081 1726882446.58746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 12081 1726882446.58756: stderr chunk (state=3): >>>debug2: match found <<< 12081 1726882446.58771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12081 1726882446.58850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12081 1726882446.58876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12081 1726882446.58892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12081 1726882446.59029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12081 1726882446.60847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12081 1726882446.60959: stderr chunk (state=3): >>><<< 12081 1726882446.60972: stdout chunk (state=3): >>><<< 12081 1726882446.61278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12081 1726882446.61283: handler run complete 12081 1726882446.61285: Evaluated conditional (False): False 12081 1726882446.61287: attempt loop complete, returning result 12081 1726882446.61289: _execute() done 12081 1726882446.61291: dumping result to json 12081 1726882446.61293: done dumping result, returning 12081 1726882446.61295: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-0a3f-ff3c-000000000e5b] 12081 1726882446.61297: sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e5b 12081 1726882446.61373: done sending task result for task 0e448fcc-3ce9-0a3f-ff3c-000000000e5b 12081 1726882446.61378: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.119045", "end": "2024-09-20 21:34:06.562506", "rc": 0, "start": "2024-09-20 21:34:06.443461" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 5754 0 --:--:-- --:--:-- --:--:-- 5865 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6767 12081 1726882446.61452: no more pending results, 
returning what we have 12081 1726882446.61455: results queue empty 12081 1726882446.61456: checking for any_errors_fatal 12081 1726882446.61470: done checking for any_errors_fatal 12081 1726882446.61470: checking for max_fail_percentage 12081 1726882446.61472: done checking for max_fail_percentage 12081 1726882446.61473: checking to see if all hosts have failed and the running result is not ok 12081 1726882446.61475: done checking to see if all hosts have failed 12081 1726882446.61475: getting the remaining hosts for this loop 12081 1726882446.61477: done getting the remaining hosts for this loop 12081 1726882446.61481: getting the next task for host managed_node3 12081 1726882446.61491: done getting next task for host managed_node3 12081 1726882446.61493: ^ task is: TASK: meta (flush_handlers) 12081 1726882446.61495: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12081 1726882446.61500: getting variables 12081 1726882446.61502: in VariableManager get_vars() 12081 1726882446.61551: Calling all_inventory to load vars for managed_node3 12081 1726882446.61554: Calling groups_inventory to load vars for managed_node3 12081 1726882446.61556: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882446.61571: Calling all_plugins_play to load vars for managed_node3 12081 1726882446.61575: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882446.61578: Calling groups_plugins_play to load vars for managed_node3 12081 1726882446.63314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882446.65073: done with get_vars() 12081 1726882446.65107: done getting variables 12081 1726882446.65184: in VariableManager get_vars() 12081 1726882446.65203: Calling all_inventory to load vars for managed_node3 12081 1726882446.65205: Calling groups_inventory to load vars for managed_node3 12081 1726882446.65207: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882446.65212: Calling all_plugins_play to load vars for managed_node3 12081 1726882446.65215: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882446.65217: Calling groups_plugins_play to load vars for managed_node3 12081 1726882446.66645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882446.68373: done with get_vars() 12081 1726882446.68416: done queuing things up, now waiting for results queue to drain 12081 1726882446.68419: results queue empty 12081 1726882446.68419: checking for any_errors_fatal 12081 1726882446.68424: done checking for any_errors_fatal 12081 1726882446.68425: checking for max_fail_percentage 12081 1726882446.68427: done checking for max_fail_percentage 12081 1726882446.68427: checking to see if all hosts have failed and the running result is not 
ok 12081 1726882446.68428: done checking to see if all hosts have failed 12081 1726882446.68429: getting the remaining hosts for this loop 12081 1726882446.68430: done getting the remaining hosts for this loop 12081 1726882446.68433: getting the next task for host managed_node3 12081 1726882446.68437: done getting next task for host managed_node3 12081 1726882446.68439: ^ task is: TASK: meta (flush_handlers) 12081 1726882446.68440: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12081 1726882446.68448: getting variables 12081 1726882446.68449: in VariableManager get_vars() 12081 1726882446.68474: Calling all_inventory to load vars for managed_node3 12081 1726882446.68476: Calling groups_inventory to load vars for managed_node3 12081 1726882446.68479: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882446.68485: Calling all_plugins_play to load vars for managed_node3 12081 1726882446.68487: Calling groups_plugins_inventory to load vars for managed_node3 12081 1726882446.68490: Calling groups_plugins_play to load vars for managed_node3 12081 1726882446.69780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12081 1726882446.71528: done with get_vars() 12081 1726882446.71551: done getting variables 12081 1726882446.71606: in VariableManager get_vars() 12081 1726882446.71623: Calling all_inventory to load vars for managed_node3 12081 1726882446.71626: Calling groups_inventory to load vars for managed_node3 12081 1726882446.71628: Calling all_plugins_inventory to load vars for managed_node3 12081 1726882446.71632: Calling all_plugins_play to load vars for managed_node3 12081 1726882446.71635: Calling groups_plugins_inventory to load vars for 
managed_node3
12081 1726882446.71637: Calling groups_plugins_play to load vars for managed_node3
12081 1726882446.72883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12081 1726882446.74635: done with get_vars()
12081 1726882446.74677: done queuing things up, now waiting for results queue to drain
12081 1726882446.74679: results queue empty
12081 1726882446.74679: checking for any_errors_fatal
12081 1726882446.74681: done checking for any_errors_fatal
12081 1726882446.74681: checking for max_fail_percentage
12081 1726882446.74682: done checking for max_fail_percentage
12081 1726882446.74683: checking to see if all hosts have failed and the running result is not ok
12081 1726882446.74684: done checking to see if all hosts have failed
12081 1726882446.74684: getting the remaining hosts for this loop
12081 1726882446.74685: done getting the remaining hosts for this loop
12081 1726882446.74688: getting the next task for host managed_node3
12081 1726882446.74691: done getting next task for host managed_node3
12081 1726882446.74691: ^ task is: None
12081 1726882446.74693: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12081 1726882446.74694: done queuing things up, now waiting for results queue to drain
12081 1726882446.74695: results queue empty
12081 1726882446.74695: checking for any_errors_fatal
12081 1726882446.74696: done checking for any_errors_fatal
12081 1726882446.74696: checking for max_fail_percentage
12081 1726882446.74697: done checking for max_fail_percentage
12081 1726882446.74698: checking to see if all hosts have failed and the running result is not ok
12081 1726882446.74698: done checking to see if all hosts have failed
12081 1726882446.74700: getting the next task for host managed_node3
12081 1726882446.74702: done getting next task for host managed_node3
12081 1726882446.74703: ^ task is: None
12081 1726882446.74704: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=148  changed=4    unreachable=0    failed=0    skipped=97   rescued=0    ignored=0

Friday 20 September 2024  21:34:06 -0400 (0:00:00.622)       0:01:06.550 ******
===============================================================================
** TEST check bond settings --------------------------------------------- 7.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
** TEST check bond settings --------------------------------------------- 2.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.67s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.62s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.61s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Create test interfaces -------------------------------------------------- 1.55s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Install dnsmasq --------------------------------------------------------- 1.55s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install pgrep, sysctl --------------------------------------------------- 1.42s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Install pgrep, sysctl --------------------------------------------------- 1.34s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Install dnsmasq --------------------------------------------------------- 1.34s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Gathering Facts --------------------------------------------------------- 1.28s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.23s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
12081 1726882446.74862: RUNNING CLEANUP