[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
32980 1727096587.60399: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
32980 1727096587.60839: Added group all to inventory
32980 1727096587.60841: Added group ungrouped to inventory
32980 1727096587.60845: Group all now contains ungrouped
32980 1727096587.60848: Examining possible inventory source: /tmp/network-EuO/inventory.yml
32980 1727096587.80222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
32980 1727096587.80329: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
32980 1727096587.80362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
32980 1727096587.80434: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
32980 1727096587.80554: Loaded config def from plugin (inventory/script)
32980 1727096587.80556: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
32980 1727096587.80646: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
32980 1727096587.80753: Loaded config def from plugin (inventory/yaml)
32980 1727096587.80757: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
32980 1727096587.80974: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
32980 1727096587.81895: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
32980 1727096587.81898: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
32980 1727096587.81901: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
32980 1727096587.81906: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
32980 1727096587.81910: Loading data from /tmp/network-EuO/inventory.yml
32980 1727096587.81973: /tmp/network-EuO/inventory.yml was not parsable by auto
32980 1727096587.82040: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
32980 1727096587.82107: Loading data from /tmp/network-EuO/inventory.yml
32980 1727096587.82236: group all already in inventory
32980 1727096587.82244: set inventory_file for managed_node1
32980 1727096587.82268: set inventory_dir for managed_node1
32980 1727096587.82270: Added host managed_node1 to inventory
32980 1727096587.82273: Added host managed_node1 to group all
32980 1727096587.82274: set ansible_host for managed_node1
32980 1727096587.82274:
set ansible_ssh_extra_args for managed_node1 32980 1727096587.82286: set inventory_file for managed_node2 32980 1727096587.82290: set inventory_dir for managed_node2 32980 1727096587.82291: Added host managed_node2 to inventory 32980 1727096587.82292: Added host managed_node2 to group all 32980 1727096587.82293: set ansible_host for managed_node2 32980 1727096587.82294: set ansible_ssh_extra_args for managed_node2 32980 1727096587.82297: set inventory_file for managed_node3 32980 1727096587.82300: set inventory_dir for managed_node3 32980 1727096587.82301: Added host managed_node3 to inventory 32980 1727096587.82304: Added host managed_node3 to group all 32980 1727096587.82305: set ansible_host for managed_node3 32980 1727096587.82306: set ansible_ssh_extra_args for managed_node3 32980 1727096587.82308: Reconcile groups and hosts in inventory. 32980 1727096587.82312: Group ungrouped now contains managed_node1 32980 1727096587.82317: Group ungrouped now contains managed_node2 32980 1727096587.82319: Group ungrouped now contains managed_node3 32980 1727096587.82438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 32980 1727096587.82681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 32980 1727096587.82755: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 32980 1727096587.82815: Loaded config def from plugin (vars/host_group_vars) 32980 1727096587.82818: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 32980 1727096587.82830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 32980 1727096587.82863: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 32980 1727096587.82946: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 32980 1727096587.83374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096587.83482: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 32980 1727096587.83561: Loaded config def from plugin (connection/local) 32980 1727096587.83565: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 32980 1727096587.84475: Loaded config def from plugin (connection/paramiko_ssh) 32980 1727096587.84479: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 32980 1727096587.85782: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32980 1727096587.85823: Loaded config def from plugin (connection/psrp) 32980 1727096587.85826: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 32980 1727096587.86540: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32980 1727096587.86583: Loaded config def from plugin (connection/ssh) 32980 1727096587.86586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 32980 1727096587.89686: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32980 1727096587.89728: Loaded config def from plugin (connection/winrm) 32980 1727096587.89732: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 32980 1727096587.89764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 32980 1727096587.89832: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 32980 1727096587.89902: Loaded config def from plugin (shell/cmd) 32980 1727096587.89904: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 32980 1727096587.89931: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 32980 1727096587.89996: Loaded config def from plugin (shell/powershell) 32980 1727096587.89998: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 32980 1727096587.90058: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 32980 1727096587.90257: Loaded config def from plugin (shell/sh) 32980 1727096587.90259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 32980 1727096587.90297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 32980 1727096587.90415: Loaded config def from plugin (become/runas) 32980 1727096587.90417: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 32980 1727096587.90665: Loaded config def from plugin (become/su) 32980 1727096587.90668: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 32980 1727096587.90841: Loaded config def from plugin (become/sudo) 32980 1727096587.90843: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 32980 1727096587.90963: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 32980 1727096587.91355: in VariableManager get_vars() 32980 1727096587.91382: done with get_vars() 32980 1727096587.91515: trying /usr/local/lib/python3.12/site-packages/ansible/modules 32980 1727096587.94706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 32980 1727096587.94937: in VariableManager get_vars() 
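
The deprecation warning at the top of this run is triggered by the plural ANSIBLE_COLLECTIONS_PATHS variable being set in the environment; ansible-core 2.17 still accepts it but will remove it in 2.19. A minimal sketch of reproducing a run like this with the singular variable instead, reconstructed from the inventory and playbook paths that appear in the log (the actual CI wrapper and its exact flags are not shown, so treat them as assumptions):

    # Sketch only: paths taken from the log above, flags assumed.
    export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-And    # singular form, avoids the 2.19 deprecation
    export ANSIBLE_DEPRECATION_WARNINGS=False               # same effect as deprecation_warnings=False in ansible.cfg
    ansible-playbook -vvvv \
      -i /tmp/network-EuO/inventory.yml \
      /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
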
32980 1727096587.94944: done with get_vars() 32980 1727096587.94947: variable 'playbook_dir' from source: magic vars 32980 1727096587.94948: variable 'ansible_playbook_python' from source: magic vars 32980 1727096587.94949: variable 'ansible_config_file' from source: magic vars 32980 1727096587.94949: variable 'groups' from source: magic vars 32980 1727096587.94950: variable 'omit' from source: magic vars 32980 1727096587.94951: variable 'ansible_version' from source: magic vars 32980 1727096587.94952: variable 'ansible_check_mode' from source: magic vars 32980 1727096587.94952: variable 'ansible_diff_mode' from source: magic vars 32980 1727096587.94953: variable 'ansible_forks' from source: magic vars 32980 1727096587.94954: variable 'ansible_inventory_sources' from source: magic vars 32980 1727096587.94954: variable 'ansible_skip_tags' from source: magic vars 32980 1727096587.94955: variable 'ansible_limit' from source: magic vars 32980 1727096587.94956: variable 'ansible_run_tags' from source: magic vars 32980 1727096587.94956: variable 'ansible_verbosity' from source: magic vars 32980 1727096587.94995: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml 32980 1727096587.95495: in VariableManager get_vars() 32980 1727096587.95513: done with get_vars() 32980 1727096587.95542: in VariableManager get_vars() 32980 1727096587.95550: done with get_vars() 32980 1727096587.95576: in VariableManager get_vars() 32980 1727096587.95595: done with get_vars() 32980 1727096587.95688: in VariableManager get_vars() 32980 1727096587.95698: done with get_vars() 32980 1727096587.95701: variable 'omit' from source: magic vars 32980 1727096587.95714: variable 'omit' from source: magic vars 32980 1727096587.95734: in VariableManager get_vars() 32980 1727096587.95741: done with get_vars() 32980 1727096587.95775: in VariableManager get_vars() 32980 1727096587.95784: done with get_vars() 32980 1727096587.95809: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32980 1727096587.95937: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32980 1727096587.96075: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32980 1727096587.96636: in VariableManager get_vars() 32980 1727096587.96664: done with get_vars() 32980 1727096587.97146: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 32980 1727096587.97274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32980 1727096587.99857: in VariableManager get_vars() 32980 1727096587.99872: done with get_vars() 32980 1727096587.99899: in VariableManager get_vars() 32980 1727096587.99919: done with get_vars() 32980 1727096587.99992: in VariableManager get_vars() 32980 1727096588.00003: done with get_vars() 32980 1727096588.00006: variable 'omit' from source: magic vars 32980 1727096588.00013: variable 'omit' from source: magic vars 32980 1727096588.00031: in VariableManager get_vars() 32980 1727096588.00039: done with get_vars() 32980 1727096588.00051: in VariableManager get_vars() 32980 1727096588.00060: done with get_vars() 32980 1727096588.00084: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32980 1727096588.00143: 
Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32980 1727096588.00192: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32980 1727096588.00483: in VariableManager get_vars() 32980 1727096588.00497: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32980 1727096588.02836: in VariableManager get_vars() 32980 1727096588.02858: done with get_vars() 32980 1727096588.02900: in VariableManager get_vars() 32980 1727096588.02919: done with get_vars() 32980 1727096588.03186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 32980 1727096588.03202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 32980 1727096588.05550: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 32980 1727096588.05933: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 32980 1727096588.05937: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 32980 1727096588.05991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 32980 1727096588.06016: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 32980 1727096588.06187: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 32980 1727096588.06311: Loaded config def from plugin (callback/default) 32980 1727096588.06314: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32980 1727096588.07156: Loaded config def from plugin (callback/junit) 32980 1727096588.07158: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32980 1727096588.07193: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 32980 1727096588.07230: Loaded config def from plugin (callback/minimal) 32980 1727096588.07232: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, 
class_only=True) 32980 1727096588.07256: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32980 1727096588.07299: Loaded config def from plugin (callback/tree) 32980 1727096588.07301: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 32980 1727096588.07372: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 32980 1727096588.07376: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************ 2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 32980 1727096588.07396: in VariableManager get_vars() 32980 1727096588.07406: done with get_vars() 32980 1727096588.07410: in VariableManager get_vars() 32980 1727096588.07415: done with get_vars() 32980 1727096588.07417: variable 'omit' from source: magic vars 32980 1727096588.07439: in VariableManager get_vars() 32980 1727096588.07447: done with get_vars() 32980 1727096588.07461: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] ********* 32980 1727096588.07924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 32980 1727096588.07997: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 32980 1727096588.08040: getting the remaining hosts for this loop 32980 1727096588.08042: done getting the remaining hosts for this loop 32980 1727096588.08045: getting the next task for host managed_node2 32980 1727096588.08049: done getting next task for host managed_node2 32980 1727096588.08051: ^ task is: TASK: Gathering Facts 32980 1727096588.08053: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096588.08055: getting variables 32980 1727096588.08056: in VariableManager get_vars() 32980 1727096588.08066: Calling all_inventory to load vars for managed_node2 32980 1727096588.08071: Calling groups_inventory to load vars for managed_node2 32980 1727096588.08074: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096588.08086: Calling all_plugins_play to load vars for managed_node2 32980 1727096588.08097: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096588.08101: Calling groups_plugins_play to load vars for managed_node2 32980 1727096588.08134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096588.08188: done with get_vars() 32980 1727096588.08194: done getting variables 32980 1727096588.08254: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6 Monday 23 September 2024 09:03:08 -0400 (0:00:00.009) 0:00:00.009 ****** 32980 1727096588.08276: entering _queue_task() for managed_node2/gather_facts 32980 1727096588.08277: Creating lock for gather_facts 32980 1727096588.08611: worker is 1 (out of 1 available) 32980 1727096588.08621: exiting _queue_task() for managed_node2/gather_facts 32980 1727096588.08634: done queuing things up, now waiting for results queue to drain 32980 1727096588.08635: waiting for pending results... 
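
The two "redirecting (type: callback)" messages and the "Skipping callback ... as we already have a stdout callback" lines show that this run uses ansible.posix.debug as its stdout callback and enables ansible.posix.profile_tasks, which is what adds the timestamp and cumulative timing line under each TASK header. A hedged sketch of one way to get the same callback setup; whether the test harness configures this through ansible.cfg or environment variables is not visible in the log:

    # Sketch: assumed controller-side configuration, not taken from the harness.
    export ANSIBLE_STDOUT_CALLBACK=ansible.posix.debug             # ansible.builtin.debug is redirected here
    export ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks   # per-task timing lines
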
32980 1727096588.08855: running TaskExecutor() for managed_node2/TASK: Gathering Facts 32980 1727096588.08916: in run() - task 0afff68d-5257-457d-ef33-0000000000af 32980 1727096588.08926: variable 'ansible_search_path' from source: unknown 32980 1727096588.08955: calling self._execute() 32980 1727096588.09002: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096588.09005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096588.09013: variable 'omit' from source: magic vars 32980 1727096588.09086: variable 'omit' from source: magic vars 32980 1727096588.09106: variable 'omit' from source: magic vars 32980 1727096588.09130: variable 'omit' from source: magic vars 32980 1727096588.09168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096588.09196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096588.09212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096588.09225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096588.09234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096588.09261: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096588.09264: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096588.09268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096588.09336: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096588.09339: Set connection var ansible_timeout to 10 32980 1727096588.09341: Set connection var ansible_shell_type to sh 32980 1727096588.09344: Set connection var ansible_connection to ssh 32980 1727096588.09352: Set connection var ansible_shell_executable to /bin/sh 32980 1727096588.09355: Set connection var ansible_pipelining to False 32980 1727096588.09377: variable 'ansible_shell_executable' from source: unknown 32980 1727096588.09380: variable 'ansible_connection' from source: unknown 32980 1727096588.09383: variable 'ansible_module_compression' from source: unknown 32980 1727096588.09385: variable 'ansible_shell_type' from source: unknown 32980 1727096588.09387: variable 'ansible_shell_executable' from source: unknown 32980 1727096588.09390: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096588.09392: variable 'ansible_pipelining' from source: unknown 32980 1727096588.09394: variable 'ansible_timeout' from source: unknown 32980 1727096588.09396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096588.09550: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096588.09558: variable 'omit' from source: magic vars 32980 1727096588.09563: starting attempt loop 32980 1727096588.09565: running the handler 32980 1727096588.09583: variable 'ansible_facts' from source: unknown 32980 1727096588.09595: _low_level_execute_command(): starting 32980 1727096588.09602: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096588.10091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.10095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.10097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.10100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.10152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096588.10155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.10159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.10198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.11877: stdout chunk (state=3): >>>/root <<< 32980 1727096588.11972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.12009: stderr chunk (state=3): >>><<< 32980 1727096588.12011: stdout chunk (state=3): >>><<< 32980 1727096588.12024: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096588.12052: _low_level_execute_command(): starting 32980 1727096588.12056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404 `" && echo ansible-tmp-1727096588.1202826-33019-76373297743404="` echo 
/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404 `" ) && sleep 0' 32980 1727096588.12473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096588.12477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096588.12480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096588.12482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096588.12491: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.12545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096588.12550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.12552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.12577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.14533: stdout chunk (state=3): >>>ansible-tmp-1727096588.1202826-33019-76373297743404=/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404 <<< 32980 1727096588.14700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.14707: stdout chunk (state=3): >>><<< 32980 1727096588.14709: stderr chunk (state=3): >>><<< 32980 1727096588.14776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096588.1202826-33019-76373297743404=/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096588.14780: variable 'ansible_module_compression' from source: unknown 
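
The "Set connection var ..." block above records the effective per-task connection settings (ssh connection, /bin/sh shell, pipelining False, a 10 second timeout, ZIP_DEFLATED module compression), and the OpenSSH debug output shows every command reusing an existing ControlMaster socket under /root/.ansible/cp/. A sketch of the controller-side knobs these correspond to; the values below simply restate what the log reports and are assumptions about how they were set, not excerpts from the test's configuration:

    # Sketch: assumed controller-side equivalents of the connection vars printed above.
    export ANSIBLE_PIPELINING=False    # matches "Set connection var ansible_pipelining to False"
    export ANSIBLE_TIMEOUT=10          # matches "Set connection var ansible_timeout to 10"
    # The ssh connection plugin requests ControlMaster/ControlPersist multiplexing, which is
    # why the log shows "auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b'".
    export ANSIBLE_SSH_ARGS='-C -o ControlMaster=auto -o ControlPersist=60s'

Enabling pipelining (ANSIBLE_PIPELINING=True) would avoid the mkdir/sftp/chmod round trips that follow, at the cost of requiring that requiretty be disabled in sudoers on the managed hosts.
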
32980 1727096588.14848: ANSIBALLZ: Using generic lock for ansible.legacy.setup 32980 1727096588.14851: ANSIBALLZ: Acquiring lock 32980 1727096588.14854: ANSIBALLZ: Lock acquired: 140258569802416 32980 1727096588.14856: ANSIBALLZ: Creating module 32980 1727096588.36713: ANSIBALLZ: Writing module into payload 32980 1727096588.36802: ANSIBALLZ: Writing module 32980 1727096588.36820: ANSIBALLZ: Renaming module 32980 1727096588.36825: ANSIBALLZ: Done creating module 32980 1727096588.36852: variable 'ansible_facts' from source: unknown 32980 1727096588.36858: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096588.36865: _low_level_execute_command(): starting 32980 1727096588.36875: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 32980 1727096588.37332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.37337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096588.37340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.37342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.37344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.37400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096588.37403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.37406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.37450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.39131: stdout chunk (state=3): >>>PLATFORM <<< 32980 1727096588.39225: stdout chunk (state=3): >>>Linux <<< 32980 1727096588.39230: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 32980 1727096588.39236: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 32980 1727096588.39390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.39454: stderr chunk (state=3): >>><<< 32980 1727096588.39461: stdout chunk (state=3): >>><<< 32980 1727096588.39505: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096588.39635 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 32980 1727096588.39639: _low_level_execute_command(): starting 32980 1727096588.39641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 32980 1727096588.39696: Sending initial data 32980 1727096588.39699: Sent initial data (1181 bytes) 32980 1727096588.40195: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.40198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.40204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.40258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096588.40263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.40265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.40298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.43723: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 
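
The probe above ("echo PLATFORM; uname; echo FOUND; command -v python3.12 ...") plus the follow-up script that reads /etc/os-release is ansible-core's interpreter discovery: it found /usr/bin/python3.12 on a CentOS Stream 10 host and will use it for the setup module transferred below. When the interpreter is already known, the round trip can be skipped by pinning it explicitly; a sketch (the test inventory may already set something equivalent, which the log does not show):

    # Sketch: pin the interpreter that discovery settled on (assumed invocation).
    # This could equally be set as a host or group variable in the inventory.
    ansible-playbook -vvvv -i /tmp/network-EuO/inventory.yml \
      -e ansible_python_interpreter=/usr/bin/python3.12 \
      /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
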
32980 1727096588.44180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.44184: stdout chunk (state=3): >>><<< 32980 1727096588.44186: stderr chunk (state=3): >>><<< 32980 1727096588.44189: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096588.44265: variable 'ansible_facts' from source: unknown 32980 1727096588.44284: variable 'ansible_facts' from source: unknown 32980 1727096588.44303: variable 'ansible_module_compression' from source: unknown 32980 1727096588.44348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32980 1727096588.44472: variable 'ansible_facts' from source: unknown 32980 1727096588.44549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py 32980 1727096588.44719: Sending initial data 32980 1727096588.44729: Sent initial data (153 bytes) 32980 1727096588.45365: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096588.45477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.45490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096588.45506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.45528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.45598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.47187: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096588.47236: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096588.47257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp4y1huwfd /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py <<< 32980 1727096588.47275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py" <<< 32980 1727096588.47294: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp4y1huwfd" to remote "/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py" <<< 32980 1727096588.47300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py" <<< 32980 1727096588.48308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.48489: stderr chunk (state=3): >>><<< 32980 1727096588.48492: stdout chunk (state=3): >>><<< 32980 1727096588.48494: done transferring module to remote 32980 1727096588.48496: _low_level_execute_command(): starting 32980 1727096588.48498: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/ /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py && sleep 0' 32980 1727096588.49028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096588.49041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 32980 1727096588.49057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.49099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.49129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.49148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.50917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096588.50946: stderr chunk (state=3): >>><<< 32980 1727096588.50950: stdout chunk (state=3): >>><<< 32980 1727096588.50959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096588.50962: _low_level_execute_command(): starting 32980 1727096588.50966: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/AnsiballZ_setup.py && sleep 0' 32980 1727096588.51531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096588.51534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096588.51544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096588.51565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096588.51586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096588.51598: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096588.51612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096588.51631: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 32980 1727096588.51643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096588.51655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096588.51672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096588.51756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096588.51783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096588.51852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096588.54004: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 32980 1727096588.54028: stdout chunk (state=3): >>>import _imp # builtin <<< 32980 1727096588.54068: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 32980 1727096588.54081: stdout chunk (state=3): >>>import '_weakref' # <<< 32980 1727096588.54141: stdout chunk (state=3): >>>import '_io' # <<< 32980 1727096588.54153: stdout chunk (state=3): >>>import 'marshal' # <<< 32980 1727096588.54181: stdout chunk (state=3): >>>import 'posix' # <<< 32980 1727096588.54217: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 32980 1727096588.54252: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 32980 1727096588.54255: stdout chunk (state=3): >>># installed zipimport hook <<< 32980 1727096588.54309: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 32980 1727096588.54312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.54345: stdout chunk (state=3): >>>import '_codecs' # <<< 32980 1727096588.54348: stdout chunk (state=3): >>>import 'codecs' # <<< 32980 1727096588.54392: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 32980 1727096588.54415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4733104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4732dfb30> <<< 32980 1727096588.54453: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 32980 1727096588.54472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473312a50> <<< 32980 1727096588.54507: stdout chunk (state=3): >>>import '_signal' # <<< 32980 1727096588.54510: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 32980 1727096588.54533: stdout chunk (state=3): >>>import 'io' # <<< 32980 1727096588.54562: stdout chunk (state=3): >>>import '_stat' 
# <<< 32980 1727096588.54576: stdout chunk (state=3): >>>import 'stat' # <<< 32980 1727096588.54641: stdout chunk (state=3): >>>import '_collections_abc' # <<< 32980 1727096588.54676: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 32980 1727096588.54721: stdout chunk (state=3): >>>import 'os' # <<< 32980 1727096588.54745: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 32980 1727096588.54760: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 32980 1727096588.54783: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 32980 1727096588.54790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32980 1727096588.54809: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473105130> <<< 32980 1727096588.54861: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 32980 1727096588.54872: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.54881: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473105fa0> <<< 32980 1727096588.54911: stdout chunk (state=3): >>>import 'site' # <<< 32980 1727096588.54934: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 32980 1727096588.55320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32980 1727096588.55323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 32980 1727096588.55352: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.55387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32980 1727096588.55413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 32980 1727096588.55433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32980 1727096588.55481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 32980 1727096588.55484: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473143da0> <<< 32980 1727096588.55504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32980 1727096588.55554: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473143fe0> <<< 32980 1727096588.55560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 32980 1727096588.55602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 32980 1727096588.55605: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32980 1727096588.55721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47317b770> <<< 32980 1727096588.55751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47317be00> import '_collections' # <<< 32980 1727096588.55794: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47315ba40> <<< 32980 1727096588.55799: stdout chunk (state=3): >>>import '_functools' # <<< 32980 1727096588.55832: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473159190> <<< 32980 1727096588.55918: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473140f50> <<< 32980 1727096588.55939: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 32980 1727096588.55966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 32980 1727096588.55974: stdout chunk (state=3): >>>import '_sre' # <<< 32980 1727096588.55999: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 32980 1727096588.56018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 32980 1727096588.56039: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32980 1727096588.56082: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47319b710> <<< 32980 1727096588.56085: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47319a330> <<< 32980 1727096588.56124: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47315a030> <<< 32980 1727096588.56128: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473198b30> <<< 32980 1727096588.56180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 32980 1727096588.56183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d07a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731401d0> <<< 32980 1727096588.56215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 32980 1727096588.56251: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.56257: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731d0c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d0b00> <<< 32980 1727096588.56300: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.56304: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731d0ec0> <<< 32980 1727096588.56307: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47313ecf0> <<< 32980 1727096588.56334: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py <<< 32980 1727096588.56338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.56354: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32980 1727096588.56401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 32980 1727096588.56404: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d15b0> <<< 32980 1727096588.56407: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d1280> import 'importlib.machinery' # <<< 32980 1727096588.56441: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 32980 1727096588.56462: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d24b0> <<< 32980 1727096588.56472: stdout chunk (state=3): >>>import 'importlib.util' # <<< 32980 1727096588.56488: stdout chunk (state=3): >>>import 'runpy' # <<< 32980 1727096588.56504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32980 1727096588.56537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 32980 1727096588.56560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 32980 1727096588.56572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731e86e0> <<< 32980 1727096588.56580: stdout chunk (state=3): >>>import 'errno' # <<< 32980 1727096588.56610: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.56614: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731e9df0> <<< 32980 1727096588.56638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 32980 1727096588.56643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 32980 1727096588.56674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 32980 1727096588.56687: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731eac60> <<< 32980 1727096588.56725: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.56728: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import 
'_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731eb2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731ea1b0> <<< 32980 1727096588.56755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 32980 1727096588.56766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32980 1727096588.56803: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.56817: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731ebd40> <<< 32980 1727096588.56820: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731eb470> <<< 32980 1727096588.56859: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d2510> <<< 32980 1727096588.56884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 32980 1727096588.56904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32980 1727096588.56925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32980 1727096588.56944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32980 1727096588.56978: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472ee7c50> <<< 32980 1727096588.57000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 32980 1727096588.57033: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f107a0> <<< 32980 1727096588.57037: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f10500> <<< 32980 1727096588.57062: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.57069: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f107d0> <<< 32980 1727096588.57096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 32980 1727096588.57103: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32980 1727096588.57174: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.57294: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.57300: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f11100> <<< 32980 1727096588.57423: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.57428: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f11af0> <<< 32980 1727096588.57436: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f109b0> <<< 32980 1727096588.57441: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472ee5df0> <<< 32980 1727096588.57466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32980 1727096588.57490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 32980 1727096588.57511: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 32980 1727096588.57523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 32980 1727096588.57531: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f12ed0> <<< 32980 1727096588.57554: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f11c40> <<< 32980 1727096588.57571: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d2c00> <<< 32980 1727096588.57595: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32980 1727096588.57683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32980 1727096588.57714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 32980 1727096588.57739: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f3b200> <<< 32980 1727096588.57814: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 32980 1727096588.57818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.57845: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32980 1727096588.57901: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f5f5f0> <<< 32980 1727096588.57913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32980 1727096588.57953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32980 1727096588.58004: stdout chunk (state=3): >>>import 'ntpath' # <<< 32980 1727096588.58039: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 32980 1727096588.58069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc03e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32980 1727096588.58097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 32980 1727096588.58107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 32980 1727096588.58139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32980 1727096588.58225: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc2b40> <<< 32980 1727096588.58336: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc0500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f853d0> <<< 32980 1727096588.58388: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472dc1430> <<< 32980 1727096588.58406: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f5e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f13dd0> <<< 32980 1727096588.58555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32980 1727096588.58574: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa472f5e9f0> <<< 32980 1727096588.58811: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_080v21st/ansible_ansible.legacy.setup_payload.zip' <<< 32980 1727096588.58818: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.58937: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.58964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 32980 1727096588.58977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32980 1727096588.59015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches 
/usr/lib64/python3.12/typing.py <<< 32980 1727096588.59089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32980 1727096588.59119: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 32980 1727096588.59122: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e23230> <<< 32980 1727096588.59137: stdout chunk (state=3): >>>import '_typing' # <<< 32980 1727096588.59309: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e06120> <<< 32980 1727096588.59316: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e05280> <<< 32980 1727096588.59324: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.59349: stdout chunk (state=3): >>>import 'ansible' # <<< 32980 1727096588.59353: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.59382: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.59394: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.59404: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 32980 1727096588.59419: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.60806: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.61971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 32980 1727096588.62012: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e21100> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 32980 1727096588.62016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.62028: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 32980 1727096588.62059: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 32980 1727096588.62095: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e56b70> <<< 32980 1727096588.62116: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e56900> <<< 32980 1727096588.62162: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e56240> <<< 32980 1727096588.62187: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32980 1727096588.62230: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e569c0> <<< 32980 1727096588.62236: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e23c50> import 'atexit' # <<< 32980 1727096588.62258: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e57890> <<< 32980 1727096588.62305: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e57ad0> <<< 32980 1727096588.62350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32980 1727096588.62356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 32980 1727096588.62359: stdout chunk (state=3): >>>import '_locale' # <<< 32980 1727096588.62408: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e57f50> <<< 32980 1727096588.62435: stdout chunk (state=3): >>>import 'pwd' # <<< 32980 1727096588.62438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 32980 1727096588.62470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32980 1727096588.62490: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472725dc0> <<< 32980 1727096588.62522: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.62534: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4727279e0> <<< 32980 1727096588.62550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32980 1727096588.62577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32980 1727096588.62624: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4727283e0> <<< 32980 1727096588.62629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32980 1727096588.62652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 32980 1727096588.62675: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472729580> <<< 32980 1727096588.62691: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32980 1727096588.62723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32980 1727096588.62760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32980 1727096588.62791: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272bf50> <<< 32980 1727096588.62846: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.62871: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472730140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272a300> <<< 32980 1727096588.62884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32980 1727096588.62927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 32980 1727096588.62960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32980 1727096588.63085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 32980 1727096588.63101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 32980 1727096588.63134: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472733f50> import '_tokenize' # <<< 32980 1727096588.63196: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732a20> <<< 32980 1727096588.63202: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732780> <<< 32980 1727096588.63208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32980 1727096588.63236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32980 1727096588.63296: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732cf0> <<< 32980 1727096588.63359: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272a7e0> <<< 32980 1727096588.63365: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472777fe0> <<< 32980 1727096588.63400: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472778350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32980 1727096588.63410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 32980 1727096588.63442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32980 1727096588.63487: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472779dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472779b80> <<< 32980 1727096588.63504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32980 1727096588.63536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32980 1727096588.63570: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47277c290> <<< 32980 1727096588.63588: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277a450> <<< 32980 1727096588.63607: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32980 1727096588.63650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.63671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 32980 1727096588.63702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 32980 1727096588.63722: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277fa70> <<< 32980 1727096588.63842: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277c440> <<< 32980 1727096588.63895: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.63907: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780860> <<< 32980 1727096588.63936: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780b00> <<< 32980 1727096588.63979: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.63982: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780c20> <<< 32980 1727096588.64000: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472778500> <<< 32980 1727096588.64012: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 32980 1727096588.64044: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32980 1727096588.64059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32980 1727096588.64083: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.64109: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47260c260><<< 32980 1727096588.64114: stdout chunk (state=3): >>> <<< 32980 1727096588.64260: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.64271: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47260d520> <<< 32980 1727096588.64294: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472782a20> <<< 32980 1727096588.64327: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472783da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472782630> # zipimport: zlib available <<< 32980 1727096588.64363: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.compat' # # zipimport: zlib available <<< 32980 1727096588.64447: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.64529: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.64584: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 32980 1727096588.64598: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 32980 1727096588.64614: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.64715: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.64829: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.65370: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.65911: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 32980 1727096588.65931: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 32980 1727096588.65957: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32980 1727096588.65976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.66034: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472611670> <<< 32980 1727096588.66151: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32980 1727096588.66154: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726124b0> <<< 32980 1727096588.66156: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47260d640> <<< 32980 1727096588.66192: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 32980 1727096588.66206: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.66229: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.66247: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 32980 1727096588.66255: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.66399: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.66553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 32980 1727096588.66577: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472612480> <<< 32980 1727096588.66580: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67041: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67481: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 
1727096588.67553: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67628: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32980 1727096588.67640: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67677: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67718: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32980 1727096588.67722: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67790: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67873: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32980 1727096588.67890: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67907: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 32980 1727096588.67924: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.67963: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68004: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32980 1727096588.68017: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68241: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32980 1727096588.68532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 32980 1727096588.68551: stdout chunk (state=3): >>>import '_ast' # <<< 32980 1727096588.68615: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726136b0> <<< 32980 1727096588.68633: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68712: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68807: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 32980 1727096588.68828: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 32980 1727096588.68870: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.68920: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 32980 1727096588.68983: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69016: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69067: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69133: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32980 1727096588.69196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.69295: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47261e240> <<< 32980 1727096588.69326: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726198e0> <<< 32980 1727096588.69359: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 32980 1727096588.69363: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69420: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69480: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69514: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69550: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.69580: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 32980 1727096588.69604: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 32980 1727096588.69626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32980 1727096588.69686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 32980 1727096588.69718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 32980 1727096588.69733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32980 1727096588.69782: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472706a80> <<< 32980 1727096588.69832: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4727fe750> <<< 32980 1727096588.69912: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47261e300> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47260d790> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 32980 1727096588.69954: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69957: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.69983: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 32980 1727096588.70055: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 32980 1727096588.70090: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32980 1727096588.70156: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70225: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70241: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70296: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70342: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70390: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 32980 1727096588.70434: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70506: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70569: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70593: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70632: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 32980 1727096588.70635: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70838: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.70988: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.71032: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.71090: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.71139: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 32980 1727096588.71154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 32980 1727096588.71200: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 32980 1727096588.71231: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b2420> <<< 32980 1727096588.71234: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 32980 1727096588.71248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 32980 1727096588.71262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 32980 1727096588.71306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 32980 1727096588.71334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 32980 1727096588.71352: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230c260> <<< 32980 1727096588.71379: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.71411: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230c650> <<< 32980 
1727096588.71475: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47269c0b0> <<< 32980 1727096588.71491: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b2f60> <<< 32980 1727096588.71504: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0ad0> <<< 32980 1727096588.71521: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0710> <<< 32980 1727096588.71529: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32980 1727096588.71577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32980 1727096588.71609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 32980 1727096588.71624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 32980 1727096588.71639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 32980 1727096588.71647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 32980 1727096588.71671: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.71679: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230f590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230ee40> <<< 32980 1727096588.71709: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.71722: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230f020> <<< 32980 1727096588.71743: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230e270> <<< 32980 1727096588.71755: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32980 1727096588.71871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 32980 1727096588.71886: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230f740> <<< 32980 1727096588.71903: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 32980 1727096588.71937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 32980 1727096588.71968: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.71976: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472372210> <<< 32980 1727096588.71999: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230cc50> <<< 32980 1727096588.72027: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0740> import 'ansible.module_utils.facts.timeout' # <<< 32980 1727096588.72047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 32980 1727096588.72070: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72085: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 32980 1727096588.72099: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72158: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 32980 1727096588.72236: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72289: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 32980 1727096588.72352: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72363: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 32980 1727096588.72387: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72417: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 32980 1727096588.72452: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72512: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 32980 1727096588.72562: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72616: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 32980 1727096588.72671: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72728: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72788: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72872: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.72909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 32980 1727096588.72921: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.73415: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.73850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 32980 1727096588.73856: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.73910: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 32980 1727096588.73966: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74003: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74037: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 32980 1727096588.74059: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74086: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32980 1727096588.74126: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74186: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 32980 1727096588.74261: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74288: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 32980 1727096588.74327: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74365: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 32980 1727096588.74406: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74483: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 32980 1727096588.74626: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472371f70> <<< 32980 1727096588.74639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 32980 1727096588.74675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 32980 1727096588.74777: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472372f90> <<< 32980 1727096588.74786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 32980 1727096588.74797: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74879: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.74932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32980 1727096588.74943: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75030: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 32980 1727096588.75131: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75199: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 32980 1727096588.75286: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75323: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75380: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 32980 1727096588.75423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 32980 1727096588.75510: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.75557: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4723ae480> <<< 32980 1727096588.75772: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47239e1b0> import 'ansible.module_utils.facts.system.python' # <<< 32980 1727096588.75791: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75834: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 32980 1727096588.75907: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.75983: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76056: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76174: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76331: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 32980 1727096588.76349: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76376: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76414: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 32980 1727096588.76429: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76466: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 32980 1727096588.76578: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096588.76596: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4723c1d00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4723c1c40> import 'ansible.module_utils.facts.system.user' # <<< 32980 1727096588.76623: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 32980 1727096588.76640: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76693: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.76730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 32980 1727096588.76905: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 32980 
1727096588.77152: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77256: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77297: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 32980 1727096588.77356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 32980 1727096588.77359: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77381: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77413: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77544: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 32980 1727096588.77703: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77825: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 32980 1727096588.77960: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.77994: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.78032: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.78599: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 32980 1727096588.79132: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79249: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79370: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 32980 1727096588.79383: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79459: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 32980 1727096588.79564: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79785: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 32980 1727096588.79892: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79896: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 32980 1727096588.79919: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.79994: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 32980 1727096588.80031: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80120: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80239: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80417: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 32980 1727096588.80623: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.aix' # <<< 32980 1727096588.80639: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80729: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 32980 1727096588.80735: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80777: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.80789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 32980 1727096588.80883: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 32980 1727096588.81110: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 32980 1727096588.81131: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81183: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 32980 1727096588.81255: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81518: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81771: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 32980 1727096588.81805: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81909: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 32980 1727096588.81934: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81958: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.81999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 32980 1727096588.82008: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82036: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 32980 1727096588.82081: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82112: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 32980 1727096588.82157: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82231: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82315: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 32980 1727096588.82330: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82345: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 32980 1727096588.82365: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82408: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 32980 1727096588.82464: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82486: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 32980 1727096588.82507: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82558: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82605: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82679: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 32980 1727096588.82756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 32980 1727096588.82781: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82824: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.82879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 32980 1727096588.82885: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83086: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 32980 1727096588.83289: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83333: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 32980 1727096588.83390: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83440: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 32980 1727096588.83499: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83577: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 32980 1727096588.83687: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83779: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.83863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32980 1727096588.83943: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096588.84302: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 32980 1727096588.84330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 32980 1727096588.84378: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4721bf0b0> <<< 32980 1727096588.84402: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa4721bf230> <<< 32980 1727096588.84441: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4721bc0b0> <<< 32980 1727096588.95996: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472204e30> <<< 32980 1727096588.96020: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 32980 1727096588.96043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 32980 1727096588.96061: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472205010> <<< 32980 1727096588.96119: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 32980 1727096588.96125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096588.96154: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 32980 1727096588.96171: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472206960> <<< 32980 1727096588.96205: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472205fd0> <<< 32980 1727096588.96459: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 32980 1727096589.20871: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "08", "epoch": "1727096588", "epoch_int": "1727096588", "date": "2024-09-23", "time": "09:03:08", "iso8601_micro": "2024-09-23T13:03:08.849681Z", "iso8601": "2024-09-23T13:03:08Z", "iso8601_basic": "20240923T090308849681", "iso8601_basic_short": "20240923T090308", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.740234375, "5m": 0.59033203125, "15m": 0.35302734375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free<<< 32980 1727096589.20887: stdout chunk (state=3): >>>": 2951}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790736384, "block_size": 4096, "block_total": 65519099, "block_available": 63913754, "block_used": 1605345, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding"<<< 32980 1727096589.20903: stdout chunk (state=3): >>>: "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32980 1727096589.21479: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 32980 1727096589.21513: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 32980 1727096589.21546: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases 
# cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig <<< 32980 1727096589.21557: stdout chunk (state=3): >>># cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno <<< 32980 1727096589.21588: stdout chunk (state=3): >>># cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 32980 1727096589.21617: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing 
platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 32980 1727096589.21663: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing 
distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic <<< 32980 1727096589.21751: stdout chunk (state=3): >>># cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd <<< 32980 1727096589.21755: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi <<< 32980 1727096589.21784: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 32980 1727096589.22107: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32980 1727096589.22115: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 32980 1727096589.22155: stdout chunk (state=3): >>># destroy _bz2 <<< 32980 1727096589.22170: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 <<< 32980 1727096589.22188: stdout chunk (state=3): >>># destroy lzma # destroy zipfile._path <<< 32980 1727096589.22192: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 32980 1727096589.22198: stdout chunk (state=3): >>># destroy ipaddress <<< 32980 1727096589.22228: stdout chunk (state=3): >>># destroy ntpath <<< 32980 1727096589.22252: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 32980 
1727096589.22256: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 32980 1727096589.22279: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 32980 1727096589.22286: stdout chunk (state=3): >>># destroy _locale <<< 32980 1727096589.22306: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 32980 1727096589.22354: stdout chunk (state=3): >>># destroy selinux <<< 32980 1727096589.22357: stdout chunk (state=3): >>># destroy shutil <<< 32980 1727096589.22374: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 32980 1727096589.22419: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 32980 1727096589.22426: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 32980 1727096589.22456: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 32980 1727096589.22459: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 32980 1727096589.22484: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 32980 1727096589.22497: stdout chunk (state=3): >>># destroy datetime <<< 32980 1727096589.22504: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 32980 1727096589.22523: stdout chunk (state=3): >>># destroy _ssl <<< 32980 1727096589.22539: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 32980 1727096589.22559: stdout chunk (state=3): >>># destroy json <<< 32980 1727096589.22584: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 32980 1727096589.22592: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 32980 1727096589.22609: stdout chunk (state=3): >>># destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 32980 1727096589.22614: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 32980 1727096589.22653: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 32980 1727096589.22678: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 32980 1727096589.22692: stdout chunk (state=3): >>># destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # 
destroy textwrap <<< 32980 1727096589.22708: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 32980 1727096589.22719: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 32980 1727096589.22743: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 32980 1727096589.22754: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 32980 1727096589.22774: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig <<< 32980 1727096589.22793: stdout chunk (state=3): >>># cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 32980 1727096589.22812: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 32980 1727096589.22820: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 32980 1727096589.22823: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 32980 1727096589.22838: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32980 1727096589.22991: stdout chunk (state=3): >>># destroy sys.monitoring <<< 32980 1727096589.23006: stdout chunk (state=3): >>># destroy _socket <<< 32980 1727096589.23009: stdout chunk (state=3): >>># destroy _collections <<< 32980 1727096589.23039: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 32980 1727096589.23046: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 32980 1727096589.23075: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 32980 1727096589.23111: stdout chunk (state=3): >>># destroy _typing <<< 32980 1727096589.23120: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # 
destroy ansible.module_utils.six.moves <<< 32980 1727096589.23133: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 32980 1727096589.23155: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 32980 1727096589.23161: stdout chunk (state=3): >>> <<< 32980 1727096589.23244: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 32980 1727096589.23261: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math <<< 32980 1727096589.23265: stdout chunk (state=3): >>># destroy _bisect # destroy time <<< 32980 1727096589.23292: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 32980 1727096589.23327: stdout chunk (state=3): >>># destroy _hashlib <<< 32980 1727096589.23338: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re<<< 32980 1727096589.23363: stdout chunk (state=3): >>> <<< 32980 1727096589.23369: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 32980 1727096589.23372: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 32980 1727096589.23776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096589.23809: stderr chunk (state=3): >>><<< 32980 1727096589.23813: stdout chunk (state=3): >>><<< 32980 1727096589.23928: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4733104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4732dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473105130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473105fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473143da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473143fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47317b770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47317be00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47315ba40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473159190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473140f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47319b710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47319a330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47315a030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa473198b30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d07a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731401d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731d0c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d0b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731d0ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47313ecf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d15b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d1280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d24b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731e86e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731e9df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731eac60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731eb2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731ea1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4731ebd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731eb470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d2510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472ee7c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f107a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f10500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f107d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f11100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472f11af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f109b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472ee5df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f12ed0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f11c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4731d2c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f3b200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f5f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc03e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc2b40> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472fc0500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f853d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472dc1430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f5e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472f13dd0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa472f5e9f0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_080v21st/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e23230> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e06120> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e05280> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e21100> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e56b70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e56900> import 
'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e56240> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e569c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e23c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e57890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472e57ad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472e57f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472725dc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4727279e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4727283e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472729580> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472730140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272a300> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472733f50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732a20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732780> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472732cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47272a7e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472777fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472778350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472779dc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472779b80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47277c290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277a450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277fa70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47277c440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780b00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472780c20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472778500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47260c260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47260d520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472782a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472783da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472782630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472611670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726124b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47260d640> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472612480> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726136b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' 
executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47261e240> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726198e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472706a80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4727fe750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47261e300> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47260d790> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b2420> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230c260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230c650> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47269c0b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b2f60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0ad0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0710> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230f590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230ee40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa47230f020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230e270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230f740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa472372210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47230cc50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4726b0740> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472371f70> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472372f90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4723ae480> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa47239e1b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4723c1d00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4723c1c40> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa4721bf0b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4721bf230> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa4721bc0b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa472204e30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472205010> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472206960> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa472205fd0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "08", "epoch": "1727096588", "epoch_int": "1727096588", "date": "2024-09-23", "time": "09:03:08", "iso8601_micro": "2024-09-23T13:03:08.849681Z", "iso8601": "2024-09-23T13:03:08Z", "iso8601_basic": "20240923T090308849681", "iso8601_basic_short": "20240923T090308", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.740234375, "5m": 0.59033203125, "15m": 0.35302734375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790736384, "block_size": 4096, "block_total": 65519099, "block_available": 63913754, "block_used": 1605345, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing 
base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc 
# cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # 
cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual 
# cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap 
# cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed.
[WARNING]: Module invocation had junk after the JSON data:
[WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
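Both warnings above are environmental notices rather than task failures. The "junk after the JSON data" is the verbose shutdown trace that CPython emits when PYTHONVERBOSE is set, and the gathered ansible_env above does show PYTHONVERBOSE=1 on the target; the second warning only records that interpreter discovery selected /usr/bin/python3.12. As a minimal, illustrative sketch (not the inventory actually used in this run), pinning the interpreter for the host would skip discovery and suppress that second warning:

# Hypothetical inventory entry, for illustration only; the real test
# inventory used by this run is not reproduced here.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.15.126                        # address seen in the facts above
      ansible_python_interpreter: /usr/bin/python3.12   # pin the interpreter to skip discovery

With ansible_python_interpreter set explicitly, fact gathering still returns the same ansible_facts payload shown above (ansible_distribution, ansible_default_ipv4, and so on), but the discovery heuristic and its warning are bypassed.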
32980 1727096589.25444: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096589.25447: _low_level_execute_command(): starting 32980 1727096589.25451: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096588.1202826-33019-76373297743404/ > /dev/null 2>&1 && sleep 0' 32980 1727096589.25453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096589.25458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.25460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.25462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.25465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096589.25471: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096589.25474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.25483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096589.25486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096589.25488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096589.25490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.25499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.25502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.25504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096589.25506: stderr chunk (state=3): >>>debug2: match found <<< 32980 1727096589.25511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.25513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096589.25515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.25517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.25522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.27294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.27340: stderr chunk (state=3): >>><<< 32980 1727096589.27344: stdout chunk (state=3): >>><<< 32980 1727096589.27380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.27384: handler run complete 32980 1727096589.27458: variable 'ansible_facts' from source: unknown 32980 1727096589.27549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.28750: variable 'ansible_facts' from source: unknown 32980 1727096589.28832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.28909: attempt loop complete, returning result 32980 1727096589.28915: _execute() done 32980 1727096589.28918: dumping result to json 32980 1727096589.28933: done dumping result, returning 32980 1727096589.28941: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-457d-ef33-0000000000af] 32980 1727096589.28943: sending task result for task 0afff68d-5257-457d-ef33-0000000000af 32980 1727096589.29228: done sending task result for task 0afff68d-5257-457d-ef33-0000000000af 32980 1727096589.29231: WORKER PROCESS EXITING ok: [managed_node2] 32980 1727096589.29556: no more pending results, returning what we have 32980 1727096589.29559: results queue empty 32980 1727096589.29560: checking for any_errors_fatal 32980 1727096589.29561: done checking for any_errors_fatal 32980 1727096589.29562: checking for max_fail_percentage 32980 1727096589.29563: done checking for max_fail_percentage 32980 1727096589.29569: checking to see if all hosts have failed and the running result is not ok 32980 1727096589.29570: done checking to see if all hosts have failed 32980 1727096589.29571: getting the remaining hosts for this loop 32980 1727096589.29574: done getting the remaining hosts for this loop 32980 1727096589.29579: getting the next task for host managed_node2 32980 1727096589.29589: done getting next task for host managed_node2 32980 1727096589.29591: ^ task is: TASK: meta (flush_handlers) 32980 1727096589.29593: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096589.29597: getting variables 32980 1727096589.29599: in VariableManager get_vars() 32980 1727096589.29621: Calling all_inventory to load vars for managed_node2 32980 1727096589.29625: Calling groups_inventory to load vars for managed_node2 32980 1727096589.29628: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.29636: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.29637: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.29639: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.29751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.29959: done with get_vars() 32980 1727096589.29966: done getting variables 32980 1727096589.30016: in VariableManager get_vars() 32980 1727096589.30022: Calling all_inventory to load vars for managed_node2 32980 1727096589.30024: Calling groups_inventory to load vars for managed_node2 32980 1727096589.30025: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.30028: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.30033: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.30035: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.30118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.30223: done with get_vars() 32980 1727096589.30232: done queuing things up, now waiting for results queue to drain 32980 1727096589.30234: results queue empty 32980 1727096589.30234: checking for any_errors_fatal 32980 1727096589.30236: done checking for any_errors_fatal 32980 1727096589.30236: checking for max_fail_percentage 32980 1727096589.30237: done checking for max_fail_percentage 32980 1727096589.30237: checking to see if all hosts have failed and the running result is not ok 32980 1727096589.30238: done checking to see if all hosts have failed 32980 1727096589.30238: getting the remaining hosts for this loop 32980 1727096589.30239: done getting the remaining hosts for this loop 32980 1727096589.30240: getting the next task for host managed_node2 32980 1727096589.30243: done getting next task for host managed_node2 32980 1727096589.30245: ^ task is: TASK: Include the task 'el_repo_setup.yml' 32980 1727096589.30246: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096589.30247: getting variables 32980 1727096589.30248: in VariableManager get_vars() 32980 1727096589.30253: Calling all_inventory to load vars for managed_node2 32980 1727096589.30255: Calling groups_inventory to load vars for managed_node2 32980 1727096589.30256: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.30259: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.30261: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.30264: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.30344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.30462: done with get_vars() 32980 1727096589.30468: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Monday 23 September 2024 09:03:09 -0400 (0:00:01.222) 0:00:01.231 ****** 32980 1727096589.30525: entering _queue_task() for managed_node2/include_tasks 32980 1727096589.30526: Creating lock for include_tasks 32980 1727096589.30765: worker is 1 (out of 1 available) 32980 1727096589.30782: exiting _queue_task() for managed_node2/include_tasks 32980 1727096589.30793: done queuing things up, now waiting for results queue to drain 32980 1727096589.30794: waiting for pending results... 32980 1727096589.30932: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 32980 1727096589.30991: in run() - task 0afff68d-5257-457d-ef33-000000000006 32980 1727096589.31045: variable 'ansible_search_path' from source: unknown 32980 1727096589.31050: calling self._execute() 32980 1727096589.31096: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.31099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.31108: variable 'omit' from source: magic vars 32980 1727096589.31188: _execute() done 32980 1727096589.31192: dumping result to json 32980 1727096589.31195: done dumping result, returning 32980 1727096589.31198: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-457d-ef33-000000000006] 32980 1727096589.31205: sending task result for task 0afff68d-5257-457d-ef33-000000000006 32980 1727096589.31302: done sending task result for task 0afff68d-5257-457d-ef33-000000000006 32980 1727096589.31304: WORKER PROCESS EXITING 32980 1727096589.31343: no more pending results, returning what we have 32980 1727096589.31347: in VariableManager get_vars() 32980 1727096589.31382: Calling all_inventory to load vars for managed_node2 32980 1727096589.31385: Calling groups_inventory to load vars for managed_node2 32980 1727096589.31388: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.31399: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.31402: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.31405: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.31528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.31634: done with get_vars() 32980 1727096589.31639: variable 'ansible_search_path' from source: unknown 32980 1727096589.31649: we have included files to process 32980 1727096589.31649: 
generating all_blocks data 32980 1727096589.31650: done generating all_blocks data 32980 1727096589.31651: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32980 1727096589.31651: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32980 1727096589.31653: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32980 1727096589.32094: in VariableManager get_vars() 32980 1727096589.32103: done with get_vars() 32980 1727096589.32110: done processing included file 32980 1727096589.32112: iterating over new_blocks loaded from include file 32980 1727096589.32113: in VariableManager get_vars() 32980 1727096589.32136: done with get_vars() 32980 1727096589.32138: filtering new block on tags 32980 1727096589.32155: done filtering new block on tags 32980 1727096589.32163: in VariableManager get_vars() 32980 1727096589.32176: done with get_vars() 32980 1727096589.32178: filtering new block on tags 32980 1727096589.32200: done filtering new block on tags 32980 1727096589.32203: in VariableManager get_vars() 32980 1727096589.32214: done with get_vars() 32980 1727096589.32215: filtering new block on tags 32980 1727096589.32238: done filtering new block on tags 32980 1727096589.32240: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 32980 1727096589.32246: extending task lists for all hosts with included blocks 32980 1727096589.32302: done extending task lists 32980 1727096589.32305: done processing included files 32980 1727096589.32306: results queue empty 32980 1727096589.32307: checking for any_errors_fatal 32980 1727096589.32308: done checking for any_errors_fatal 32980 1727096589.32309: checking for max_fail_percentage 32980 1727096589.32310: done checking for max_fail_percentage 32980 1727096589.32311: checking to see if all hosts have failed and the running result is not ok 32980 1727096589.32311: done checking to see if all hosts have failed 32980 1727096589.32312: getting the remaining hosts for this loop 32980 1727096589.32313: done getting the remaining hosts for this loop 32980 1727096589.32315: getting the next task for host managed_node2 32980 1727096589.32320: done getting next task for host managed_node2 32980 1727096589.32322: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 32980 1727096589.32324: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096589.32326: getting variables 32980 1727096589.32327: in VariableManager get_vars() 32980 1727096589.32333: Calling all_inventory to load vars for managed_node2 32980 1727096589.32338: Calling groups_inventory to load vars for managed_node2 32980 1727096589.32340: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.32348: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.32351: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.32356: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.32537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.32723: done with get_vars() 32980 1727096589.32733: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 09:03:09 -0400 (0:00:00.022) 0:00:01.254 ****** 32980 1727096589.32820: entering _queue_task() for managed_node2/setup 32980 1727096589.33139: worker is 1 (out of 1 available) 32980 1727096589.33151: exiting _queue_task() for managed_node2/setup 32980 1727096589.33163: done queuing things up, now waiting for results queue to drain 32980 1727096589.33165: waiting for pending results... 32980 1727096589.33381: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 32980 1727096589.33454: in run() - task 0afff68d-5257-457d-ef33-0000000000c0 32980 1727096589.33463: variable 'ansible_search_path' from source: unknown 32980 1727096589.33466: variable 'ansible_search_path' from source: unknown 32980 1727096589.33518: calling self._execute() 32980 1727096589.33590: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.33677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.33682: variable 'omit' from source: magic vars 32980 1727096589.34123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096589.35816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096589.35862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096589.35896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096589.35930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096589.35950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096589.36015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096589.36036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096589.36052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32980 1727096589.36081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096589.36092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096589.36210: variable 'ansible_facts' from source: unknown 32980 1727096589.36256: variable 'network_test_required_facts' from source: task vars 32980 1727096589.36288: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 32980 1727096589.36291: variable 'omit' from source: magic vars 32980 1727096589.36404: variable 'omit' from source: magic vars 32980 1727096589.36408: variable 'omit' from source: magic vars 32980 1727096589.36410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096589.36420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096589.36438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096589.36471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096589.36481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096589.36507: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096589.36510: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.36515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.36630: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096589.36666: Set connection var ansible_timeout to 10 32980 1727096589.36670: Set connection var ansible_shell_type to sh 32980 1727096589.36675: Set connection var ansible_connection to ssh 32980 1727096589.36701: Set connection var ansible_shell_executable to /bin/sh 32980 1727096589.36705: Set connection var ansible_pipelining to False 32980 1727096589.36713: variable 'ansible_shell_executable' from source: unknown 32980 1727096589.36715: variable 'ansible_connection' from source: unknown 32980 1727096589.36718: variable 'ansible_module_compression' from source: unknown 32980 1727096589.36720: variable 'ansible_shell_type' from source: unknown 32980 1727096589.36723: variable 'ansible_shell_executable' from source: unknown 32980 1727096589.36725: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.36729: variable 'ansible_pipelining' from source: unknown 32980 1727096589.36731: variable 'ansible_timeout' from source: unknown 32980 1727096589.36735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.36851: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096589.36858: variable 'omit' from source: magic vars 32980 1727096589.36863: starting attempt loop 32980 
1727096589.36866: running the handler 32980 1727096589.36881: _low_level_execute_command(): starting 32980 1727096589.36887: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096589.37419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.37423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.37425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.37427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.37497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.37505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.37543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.39264: stdout chunk (state=3): >>>/root <<< 32980 1727096589.39349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.39393: stderr chunk (state=3): >>><<< 32980 1727096589.39404: stdout chunk (state=3): >>><<< 32980 1727096589.39429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.39452: _low_level_execute_command(): starting 32980 1727096589.39456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218 `" && echo 
ansible-tmp-1727096589.3942835-33061-50359637662218="` echo /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218 `" ) && sleep 0' 32980 1727096589.40043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096589.40110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.40166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096589.40186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.40189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.40228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.42205: stdout chunk (state=3): >>>ansible-tmp-1727096589.3942835-33061-50359637662218=/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218 <<< 32980 1727096589.42332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.42398: stderr chunk (state=3): >>><<< 32980 1727096589.42402: stdout chunk (state=3): >>><<< 32980 1727096589.42454: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096589.3942835-33061-50359637662218=/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.42492: variable 'ansible_module_compression' from source: unknown 32980 1727096589.42530: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32980 1727096589.42579: variable 'ansible_facts' from source: unknown 32980 1727096589.42875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py 32980 1727096589.42963: Sending initial data 32980 1727096589.43013: Sent initial data (153 bytes) 32980 1727096589.43595: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.43617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.43628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.43672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096589.43686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.43729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.45390: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096589.45423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096589.45451: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpxqnr6qdu /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py <<< 32980 1727096589.45463: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py" <<< 32980 1727096589.45490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpxqnr6qdu" to remote "/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py" <<< 32980 1727096589.45492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py" <<< 32980 1727096589.46486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.46528: stderr chunk (state=3): >>><<< 32980 1727096589.46531: stdout chunk (state=3): >>><<< 32980 1727096589.46548: done transferring module to remote 32980 1727096589.46563: _low_level_execute_command(): starting 32980 1727096589.46568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/ /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py && sleep 0' 32980 1727096589.47389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.47475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.49528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.49559: stderr chunk (state=3): >>><<< 32980 1727096589.49570: stdout chunk (state=3): >>><<< 32980 1727096589.49601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.49609: _low_level_execute_command(): starting 32980 1727096589.49618: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/AnsiballZ_setup.py && sleep 0' 32980 1727096589.50268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096589.50284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.50296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.50311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.50326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096589.50335: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096589.50353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.50455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096589.50484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.50502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.50593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.52850: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 32980 1727096589.52876: stdout chunk (state=3): >>>import _imp # builtin <<< 32980 1727096589.52909: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 32980 1727096589.52980: stdout chunk (state=3): >>>import '_io' # <<< 32980 1727096589.52993: stdout chunk (state=3): >>>import 'marshal' # <<< 32980 1727096589.53025: stdout chunk (state=3): >>>import 'posix' # <<< 32980 1727096589.53059: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 32980 1727096589.53097: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 32980 1727096589.53101: stdout chunk (state=3): >>># installed zipimport hook <<< 32980 1727096589.53148: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.53171: stdout chunk (state=3): >>>import '_codecs' # <<< 32980 1727096589.53194: stdout chunk (state=3): >>>import 'codecs' # <<< 32980 1727096589.53229: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 32980 1727096589.53252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 32980 1727096589.53274: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed37b30> <<< 32980 1727096589.53299: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 32980 1727096589.53315: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed6aa50> <<< 32980 1727096589.53335: stdout chunk (state=3): >>>import '_signal' # <<< 32980 1727096589.53358: stdout chunk (state=3): >>>import '_abc' # <<< 32980 1727096589.53371: stdout chunk (state=3): >>>import 'abc' # <<< 32980 1727096589.53385: stdout chunk (state=3): >>>import 'io' # <<< 32980 1727096589.53421: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32980 1727096589.53509: stdout chunk (state=3): >>>import '_collections_abc' # <<< 32980 1727096589.53539: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 32980 1727096589.53571: stdout chunk (state=3): >>>import 'os' # <<< 32980 1727096589.53594: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 32980 1727096589.53601: stdout chunk (state=3): >>>Processing user site-packages <<< 32980 1727096589.53619: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 32980 1727096589.53631: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 32980 1727096589.53642: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 32980 1727096589.53666: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 32980 1727096589.53677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32980 1727096589.53692: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb3d130> <<< 32980 1727096589.53748: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 32980 1727096589.53762: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.53770: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb3dfa0> <<< 32980 1727096589.53796: stdout chunk (state=3): >>>import 'site' # <<< 32980 1727096589.53828: 
stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 32980 1727096589.54212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32980 1727096589.54251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 32980 1727096589.54254: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 32980 1727096589.54287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32980 1727096589.54337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 32980 1727096589.54350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32980 1727096589.54382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 32980 1727096589.54412: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb7be30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 32980 1727096589.54452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32980 1727096589.54455: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb7bef0> <<< 32980 1727096589.54481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 32980 1727096589.54500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 32980 1727096589.54533: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32980 1727096589.54583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.54614: stdout chunk (state=3): >>>import 'itertools' # <<< 32980 1727096589.54617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 32980 1727096589.54665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebb3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 32980 1727096589.54670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebb3ef0> <<< 32980 1727096589.54694: stdout chunk (state=3): >>>import '_collections' # <<< 32980 1727096589.54727: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb93b00> <<< 32980 1727096589.54749: stdout chunk (state=3): >>>import '_functools' # <<< 
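The chunks above and below are the managed host's Python interpreter import trace: the controller has staged AnsiballZ_setup.py in the remote temporary directory and launched it with PYTHONVERBOSE=1, so every module import on the remote side streams back as a stdout chunk. The task driving this run is the setup invocation queued at 1727096589.32820 ("Gather the minimum subset of ansible_facts required by the network role test"), which only runs because the conditional evaluated at 1727096589.36288 returned True. A minimal sketch of such a task follows; the real task lives in el_repo_setup.yml, which this log does not reproduce, so the gather_subset value and the contents of network_test_required_facts are illustrative assumptions, while the when expression is copied from the evaluated conditional above.

    # Sketch only -- values marked as assumptions; the when expression matches the
    # conditional evaluated at 1727096589.36288 in the trace above.
    - name: Gather the minimum subset of ansible_facts required by the network role test
      ansible.builtin.setup:
        gather_subset: min   # assumed subset; any setup call produces an AnsiballZ_setup.py run like the one traced here
      when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
      vars:
        # hypothetical value; the log only shows that this variable comes from task vars
        network_test_required_facts:
          - distribution

Even with facts limited this way, the module payload still has to be copied and executed remotely, which is why the surrounding chunks show the sftp upload of AnsiballZ_setup.py, the chmod of the temporary directory, and then this verbose import log from /usr/bin/python3.12 running the module.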
32980 1727096589.54777: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb91220> <<< 32980 1727096589.54932: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb78fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 32980 1727096589.54950: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 32980 1727096589.54989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 32980 1727096589.54999: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32980 1727096589.55050: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd37d0> <<< 32980 1727096589.55078: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd23f0> <<< 32980 1727096589.55087: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb92210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd0c50> <<< 32980 1727096589.55141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 32980 1727096589.55147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 32980 1727096589.55166: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec08800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb78290> <<< 32980 1727096589.55188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 32980 1727096589.55227: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.55232: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec08cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec08b60> <<< 32980 1727096589.55272: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec08ef0> <<< 32980 1727096589.55293: stdout chunk (state=3): >>>import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb76db0> <<< 32980 1727096589.55318: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.55342: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32980 1727096589.55382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 32980 1727096589.55396: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec09250> <<< 32980 1727096589.55401: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 32980 1727096589.55433: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 32980 1727096589.55452: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0a450> <<< 32980 1727096589.55477: stdout chunk (state=3): >>>import 'importlib.util' # <<< 32980 1727096589.55480: stdout chunk (state=3): >>>import 'runpy' # <<< 32980 1727096589.55507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32980 1727096589.55537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 32980 1727096589.55571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 32980 1727096589.55579: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec20680> <<< 32980 1727096589.55598: stdout chunk (state=3): >>>import 'errno' # <<< 32980 1727096589.55628: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.55631: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.55636: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec21d60> <<< 32980 1727096589.55662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 32980 1727096589.55664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 32980 1727096589.55704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 32980 1727096589.55707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 32980 1727096589.55718: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec22c00> <<< 32980 1727096589.55754: stdout chunk (state=3): >>># extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.55758: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec23260> <<< 32980 1727096589.55776: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec22150> <<< 32980 1727096589.55788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 32980 1727096589.55806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32980 1727096589.55845: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.55848: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec23ce0> <<< 32980 1727096589.55869: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec23410> <<< 32980 1727096589.55911: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0a4b0> <<< 32980 1727096589.55934: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 32980 1727096589.55956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32980 1727096589.55981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32980 1727096589.56001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32980 1727096589.56035: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e923c20> <<< 32980 1727096589.56059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 32980 1727096589.56063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 32980 1727096589.56092: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.56096: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94c3b0> <<< 32980 1727096589.56129: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.56134: stdout chunk (state=3): >>># extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94c680> <<< 32980 1727096589.56160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32980 1727096589.56244: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.56363: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94cfb0> <<< 32980 1727096589.56494: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.56501: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94d9a0> <<< 32980 1727096589.56515: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94c860> <<< 32980 1727096589.56537: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e921dc0> <<< 32980 1727096589.56555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32980 1727096589.56583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 32980 1727096589.56602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 32980 1727096589.56619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 32980 1727096589.56624: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94ed80> <<< 32980 1727096589.56651: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94dac0> <<< 32980 1727096589.56668: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0aba0> <<< 32980 1727096589.56697: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32980 1727096589.56756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.56780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32980 1727096589.56813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 32980 1727096589.56842: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e977110> <<< 32980 1727096589.56897: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py 
<<< 32980 1727096589.56917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.56935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32980 1727096589.56958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32980 1727096589.57000: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e99b4a0> <<< 32980 1727096589.57027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32980 1727096589.57068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32980 1727096589.57133: stdout chunk (state=3): >>>import 'ntpath' # <<< 32980 1727096589.57163: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fc260> <<< 32980 1727096589.57189: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32980 1727096589.57232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 32980 1727096589.57235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 32980 1727096589.57282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32980 1727096589.57363: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fe9c0> <<< 32980 1727096589.57446: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fc380> <<< 32980 1727096589.57487: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9c5250> <<< 32980 1727096589.57519: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e329340> <<< 32980 1727096589.57532: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e99a2d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94fce0> <<< 32980 1727096589.57718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32980 1727096589.57730: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd8e3295e0> <<< 32980 1727096589.58003: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_9qgtmecg/ansible_setup_payload.zip' # zipimport: zlib available <<< 32980 1727096589.58130: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.58178: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32980 1727096589.58198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 32980 1727096589.58275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32980 1727096589.58304: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e393080> <<< 32980 1727096589.58317: stdout chunk (state=3): >>>import '_typing' # <<< 32980 1727096589.58495: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e371f70> <<< 32980 1727096589.58511: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e371100> # zipimport: zlib available <<< 32980 1727096589.58538: stdout chunk (state=3): >>>import 'ansible' # <<< 32980 1727096589.58556: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.58571: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.58593: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.58604: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 32980 1727096589.58620: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.60029: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.61174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e390f20> <<< 32980 1727096589.61204: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.61232: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 32980 1727096589.61235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 32980 1727096589.61280: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 32980 1727096589.61293: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c2960> <<< 32980 1727096589.61329: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c26f0> <<< 32980 1727096589.61400: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c2000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32980 1727096589.61441: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c2ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e393aa0> <<< 32980 1727096589.61466: stdout chunk (state=3): >>>import 'atexit' # <<< 32980 1727096589.61519: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c3650> <<< 32980 1727096589.61543: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32980 1727096589.61574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 32980 1727096589.61594: stdout chunk (state=3): >>>import '_locale' # <<< 32980 1727096589.61636: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c3ce0> <<< 32980 1727096589.61644: stdout chunk (state=3): >>>import 'pwd' # <<< 32980 1727096589.61663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 32980 1727096589.61693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32980 1727096589.61729: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e22db20> <<< 32980 1727096589.61758: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e22f740> <<< 32980 1727096589.61786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32980 1727096589.61798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32980 1727096589.61840: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e230110> <<< 32980 1727096589.61855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32980 1727096589.61888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 32980 1727096589.61903: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2312b0> <<< 32980 1727096589.61925: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32980 1727096589.61959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32980 1727096589.61984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32980 1727096589.62039: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e233d70> <<< 32980 1727096589.62078: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e373170> <<< 32980 1727096589.62100: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e232030> <<< 32980 1727096589.62122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32980 1727096589.62150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 32980 1727096589.62176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 32980 1727096589.62199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32980 1727096589.62314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 32980 1727096589.62341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 32980 1727096589.62360: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23bc50> <<< 32980 1727096589.62371: stdout chunk (state=3): >>>import '_tokenize' # <<< 32980 1727096589.62437: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a480> <<< 32980 1727096589.62471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32980 1727096589.62474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32980 1727096589.62546: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a9f0> <<< 32980 1727096589.62572: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2324b0> <<< 32980 1727096589.62606: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e27ff50> <<< 32980 1727096589.62634: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e280050> <<< 32980 1727096589.62662: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32980 1727096589.62679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 32980 1727096589.62706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32980 1727096589.62742: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.62748: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e281b20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e281910> <<< 32980 1727096589.62766: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32980 1727096589.62801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32980 1727096589.62851: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e283f80> <<< 32980 1727096589.62855: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e282180> <<< 32980 1727096589.62878: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32980 1727096589.62926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.62939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 32980 1727096589.62957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 32980 1727096589.62962: stdout chunk (state=3): >>>import '_string' # <<< 32980 1727096589.63009: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2876b0> <<< 32980 1727096589.63133: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e283ef0> <<< 32980 1727096589.63194: stdout chunk (state=3): >>># extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e288770> <<< 32980 1727096589.63228: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e2888f0> <<< 32980 1727096589.63277: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e2889b0> <<< 32980 1727096589.63291: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e280230> <<< 32980 1727096589.63318: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 32980 1727096589.63337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32980 1727096589.63364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32980 1727096589.63390: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.63427: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e28bfe0> <<< 32980 1727096589.63637: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e115190> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e28a7b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e28bb60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e28a420> <<< 32980 1727096589.63666: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 32980 1727096589.63695: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 
1727096589.63784: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.63877: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.63911: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 32980 1727096589.63927: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 32980 1727096589.63954: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.64066: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.64209: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.64740: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.65313: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 32980 1727096589.65317: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 32980 1727096589.65340: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32980 1727096589.65360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.65417: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e1192b0> <<< 32980 1727096589.65510: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32980 1727096589.65533: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11a090> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2884a0> <<< 32980 1727096589.65594: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 32980 1727096589.65622: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.65650: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 32980 1727096589.65659: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.65893: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.66000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 32980 1727096589.66020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11a1e0> <<< 32980 1727096589.66037: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.66521: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.66954: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67024: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67113: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32980 
1727096589.67120: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67144: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67185: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32980 1727096589.67194: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67358: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32980 1727096589.67381: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 32980 1727096589.67419: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67433: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67465: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32980 1727096589.67491: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67702: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.67945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32980 1727096589.68025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 32980 1727096589.68028: stdout chunk (state=3): >>>import '_ast' # <<< 32980 1727096589.68090: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11b290> <<< 32980 1727096589.68101: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68180: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68257: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 32980 1727096589.68278: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 32980 1727096589.68306: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68333: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68374: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 32980 1727096589.68382: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68448: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68511: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68542: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.68659: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.68790: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e125dc0> <<< 32980 1727096589.68853: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcd8e123e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 32980 1727096589.68904: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.68989: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.69094: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.69145: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32980 1727096589.69208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32980 1727096589.69262: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e20e6c0> <<< 32980 1727096589.69328: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3ee390> <<< 32980 1727096589.69399: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e125d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11b020> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 32980 1727096589.69454: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.69538: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 32980 1727096589.69623: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.69710: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.69728: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.69750: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.69806: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.69842: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.69887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 32980 1727096589.69890: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.69991: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.70095: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.70129: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 32980 1727096589.70286: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.70443: stdout chunk (state=3): >>># zipimport: zlib available <<< 
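
(Editorial aside, for orientation only: the trace above shows the module payload being located and imported via Python's zipimport machinery — the "# zipimport: found 103 names in '/tmp/ansible_setup_payload_9qgtmecg/ansible_setup_payload.zip'" line and the repeated "# zipimport: zlib available" markers. As a rough, minimal illustration of that underlying mechanism only — not Ansible's actual AnsiballZ payload wrapper — the sketch below imports a module from a zip archive placed on sys.path; the module name "demo_module" is hypothetical and exists only for this example.)

import importlib
import os
import sys
import tempfile
import zipfile

# Minimal sketch: write a tiny module into a zip archive, then import it.
# Zip archives on sys.path are served by the zipimport importer, which is
# what produces the "# zipimport: ..." lines in the verbose trace above.
payload = os.path.join(tempfile.mkdtemp(), "payload.zip")
with zipfile.ZipFile(payload, "w") as zf:
    # "demo_module" is a hypothetical module name used only for this sketch
    zf.writestr("demo_module.py", "GREETING = 'imported from a zip'\n")

sys.path.insert(0, payload)            # make the archive importable
demo = importlib.import_module("demo_module")
print(demo.GREETING)                   # -> imported from a zip

(The real payload seen in the trace bundles ansible.module_utils and the setup module into one self-extracting zip, but the import path through zipimport is the same idea.)
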
32980 1727096589.70510: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.70561: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096589.70613: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 32980 1727096589.70652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b5b50> <<< 32980 1727096589.70694: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 32980 1727096589.70697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 32980 1727096589.70707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 32980 1727096589.70995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd4fd10> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd4ff80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e19e6c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b66f0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b4230> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b7ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32980 1727096589.71033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32980 1727096589.71070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 32980 1727096589.71089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 32980 1727096589.71128: stdout chunk (state=3): >>># extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd57020> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd568d0> <<< 32980 1727096589.71229: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd56ab0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd55d00> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32980 1727096589.71321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 32980 1727096589.71349: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd571d0> <<< 32980 1727096589.71499: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 32980 1727096589.71546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ddb5d00> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd57ce0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b7e90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 32980 1727096589.71562: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.71603: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.71653: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 32980 1727096589.71684: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.71747: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.71817: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 32980 1727096589.71857: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.71905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 32980 1727096589.71946: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 32980 1727096589.72014: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 32980 1727096589.72051: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 32980 1727096589.72107: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72160: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72224: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72283: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.72345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 32980 1727096589.72362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 32980 1727096589.72840: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73286: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 32980 1727096589.73289: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73340: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73390: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73426: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73476: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 32980 1727096589.73495: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73514: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32980 1727096589.73558: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73606: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 32980 1727096589.73706: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73717: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 32980 1727096589.73777: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73793: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 32980 1727096589.73838: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73906: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.73996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 32980 1727096589.74014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 32980 1727096589.74041: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ddb5a60> <<< 32980 1727096589.74052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 32980 1727096589.74076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 32980 1727096589.74222: stdout chunk (state=3): >>>import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ddb6960> import 'ansible.module_utils.facts.system.local' # <<< 32980 1727096589.74240: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74278: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32980 1727096589.74372: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74546: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.74741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 32980 1727096589.74745: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74838: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.74887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 32980 1727096589.74993: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.74997: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ddf20c0> <<< 32980 1727096589.75185: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dde1eb0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 32980 1727096589.75269: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.75356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 32980 1727096589.75428: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.75500: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.75638: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.75804: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.75876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 32980 1727096589.75963: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.76003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096589.76122: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8de05e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dde31d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 32980 1727096589.76159: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 32980 1727096589.76311: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.76497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 32980 1727096589.76547: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.76726: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.76792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.76817: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.76975: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.77135: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 32980 1727096589.77206: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.77326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 32980 1727096589.77336: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.77369: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.77406: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.77979: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.78525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 32980 1727096589.78545: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 32980 1727096589.78635: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.78751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 32980 1727096589.78773: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.78846: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 32980 1727096589.79128: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 32980 1727096589.79308: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 32980 1727096589.79316: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79358: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 32980 1727096589.79437: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79511: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79638: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.79812: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80019: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 32980 1727096589.80044: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80071: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 32980 1727096589.80113: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80140: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 32980 1727096589.80181: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 32980 1727096589.80325: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80376: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 32980 1727096589.80396: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80442: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 32980 1727096589.80517: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80571: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 32980 1727096589.80641: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.80901: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81164: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 32980 1727096589.81179: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81237: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 32980 1727096589.81307: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81340: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 32980 1727096589.81389: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81417: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 32980 1727096589.81472: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81500: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81545: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 32980 1727096589.81548: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81629: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 32980 1727096589.81754: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: 
zlib available <<< 32980 1727096589.81792: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 32980 1727096589.81908: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.81957: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096589.82018: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82071: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 32980 1727096589.82154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 32980 1727096589.82186: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82236: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 32980 1727096589.82470: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 32980 1727096589.82720: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 32980 1727096589.82783: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82820: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 32980 1727096589.82888: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.82983: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.83044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 32980 1727096589.83060: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.83144: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.83235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32980 1727096589.83324: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096589.84241: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 32980 1727096589.84255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 32980 1727096589.84290: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 32980 1727096589.84342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 32980 1727096589.84368: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dc07ad0> <<< 32980 1727096589.84385: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dc06e40> <<< 32980 1727096589.84431: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dc043e0> <<< 32980 1727096589.84833: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "09", "epoch": "1727096589", "epoch_int": "1727096589", "date": 
"2024-09-23", "time": "09:03:09", "iso8601_micro": "2024-09-23T13:03:09.838437Z", "iso8601": "2024-09-23T13:03:09Z", "iso8601_basic": "20240923T090309838437", "iso8601_basic_short": "20240923T090309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32980 1727096589.85447: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 32980 1727096589.85497: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # 
cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 32980 1727096589.85540: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] 
removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 32980 1727096589.85606: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 32980 1727096589.85648: stdout chunk (state=3): >>># cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] 
removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime <<< 32980 1727096589.85742: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl <<< 32980 1727096589.85803: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy 
ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 32980 1727096589.86305: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 32980 1727096589.86360: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 32980 1727096589.86363: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 32980 1727096589.86429: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 32980 1727096589.86477: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 32980 1727096589.86517: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # 
destroy fcntl <<< 32980 1727096589.86533: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 32980 1727096589.86572: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 32980 1727096589.86625: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 32980 1727096589.86658: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 32980 1727096589.86781: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 32980 1727096589.86931: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 32980 1727096589.86952: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32980 1727096589.87124: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 32980 1727096589.87230: stdout chunk (state=3): >>># destroy _collections 
# destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 32980 1727096589.87234: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 32980 1727096589.87252: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 32980 1727096589.87416: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 32980 1727096589.87494: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 32980 1727096589.87497: stdout chunk (state=3): >>># clear sys.audit hooks <<< 32980 1727096589.87855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
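The stdout gathered above carries the full fact payload returned by the setup module, which the invocation block shows was run with module_args gather_subset: ["min"], gather_timeout: 10, filter: [], fact_path: /etc/ansible/facts.d. A minimal sketch of reading a few of those fields back out of a saved copy of that JSON result, assuming it has been written to a hypothetical file named facts.json (the file name and the act of saving it are not part of this run):

    import json

    # Load a saved copy of the module result shown in the stdout above
    # (hypothetical path; this run never writes such a file itself).
    with open("facts.json") as fh:
        result = json.load(fh)

    facts = result["ansible_facts"]

    # A few keys present in the gathered "min" subset, per the payload above.
    print(facts["ansible_hostname"])                      # e.g. ip-10-31-15-126
    print(facts["ansible_distribution"],
          facts["ansible_distribution_version"])          # e.g. CentOS 10
    print(facts["ansible_python"]["executable"])          # e.g. /usr/bin/python3.12

The surrounding import/cleanup noise in each stdout chunk is explained by the remote environment itself: ansible_env in the payload shows PYTHONVERBOSE=1, so the target interpreter traces every module import and teardown while executing the AnsiballZ payload.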
<<< 32980 1727096589.87866: stdout chunk (state=3): >>><<< 32980 1727096589.87889: stderr chunk (state=3): >>><<< 32980 1727096589.88079: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ed6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb3d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb3dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb7be30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb7bef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebb3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebb3ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb93b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb91220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb78fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb92210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ebd0c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec08800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb78290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec08cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec08b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec08ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8eb76db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec09250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec20680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec21d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcd8ec22c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec23260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec22150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ec23ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec23410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e923c20> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94c3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94cfb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e94d9a0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94c860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e921dc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94ed80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94dac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ec0aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e977110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e99b4a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fe9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9fc380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e9c5250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e329340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e99a2d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e94fce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd8e3295e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_9qgtmecg/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e393080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e371f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e371100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e390f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c2960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c26f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c2000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c2ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e393aa0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c3650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e3c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3c3ce0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e22db20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e22f740> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e230110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2312b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e233d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e373170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e232030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23bc50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a480> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e23a9f0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2324b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e27ff50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e280050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e281b20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e281910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e283f80> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e282180> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2876b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e283ef0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e288770> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e2888f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e2889b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e280230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e28bfe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e115190> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e28a7b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e28bb60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e28a420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e1192b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11a090> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e2884a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11a1e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11b290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8e125dc0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e123e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e20e6c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e3ee390> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e125d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e11b020> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b5b50> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd4fd10> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd4ff80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e19e6c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b66f0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b4230> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b7ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd57020> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd568d0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dd56ab0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd55d00> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd571d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ddb5d00> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dd57ce0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8e1b7e90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ddb5a60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8ddb6960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8ddf20c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dde1eb0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8de05e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dde31d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd8dc07ad0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dc06e40> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd8dc043e0> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "09", "epoch": "1727096589", "epoch_int": "1727096589", "date": "2024-09-23", "time": "09:03:09", "iso8601_micro": "2024-09-23T13:03:09.838437Z", "iso8601": "2024-09-23T13:03:09Z", "iso8601_basic": "20240923T090309838437", "iso8601_basic_short": "20240923T090309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_python": {"version": {"major": 3, 
"minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset 
# cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 32980 1727096589.88907: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096589.88910: _low_level_execute_command(): starting 32980 1727096589.88912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096589.3942835-33061-50359637662218/ > /dev/null 2>&1 && sleep 0' 32980 1727096589.88915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096589.88917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.88939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.88943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.88955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.88999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.90878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.90882: stdout chunk (state=3): >>><<< 32980 1727096589.90884: stderr chunk (state=3): >>><<< 32980 1727096589.90899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.90934: handler run complete 32980 1727096589.90965: variable 'ansible_facts' from source: unknown 32980 1727096589.91024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.91149: variable 'ansible_facts' from source: unknown 32980 1727096589.91187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.91246: attempt loop complete, returning result 32980 1727096589.91272: _execute() done 32980 1727096589.91278: dumping result to json 32980 1727096589.91287: done dumping result, returning 32980 1727096589.91367: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-457d-ef33-0000000000c0] 32980 1727096589.91375: sending task result for task 0afff68d-5257-457d-ef33-0000000000c0 ok: [managed_node2] 32980 1727096589.91697: no more pending results, returning what we have 32980 1727096589.91700: results queue empty 32980 1727096589.91701: checking for any_errors_fatal 32980 1727096589.91702: done checking for any_errors_fatal 32980 1727096589.91703: checking for max_fail_percentage 32980 1727096589.91705: done checking for max_fail_percentage 32980 1727096589.91706: checking to see if all hosts have failed and the running result is not ok 32980 1727096589.91706: done checking to see if all 
hosts have failed 32980 1727096589.91707: getting the remaining hosts for this loop 32980 1727096589.91709: done getting the remaining hosts for this loop 32980 1727096589.91713: getting the next task for host managed_node2 32980 1727096589.91723: done getting next task for host managed_node2 32980 1727096589.91725: ^ task is: TASK: Check if system is ostree 32980 1727096589.91728: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096589.91732: getting variables 32980 1727096589.91734: in VariableManager get_vars() 32980 1727096589.91763: Calling all_inventory to load vars for managed_node2 32980 1727096589.91766: Calling groups_inventory to load vars for managed_node2 32980 1727096589.91976: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096589.91988: Calling all_plugins_play to load vars for managed_node2 32980 1727096589.91991: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096589.91994: Calling groups_plugins_play to load vars for managed_node2 32980 1727096589.92204: done sending task result for task 0afff68d-5257-457d-ef33-0000000000c0 32980 1727096589.92208: WORKER PROCESS EXITING 32980 1727096589.92221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096589.92411: done with get_vars() 32980 1727096589.92421: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 09:03:09 -0400 (0:00:00.596) 0:00:01.851 ****** 32980 1727096589.92515: entering _queue_task() for managed_node2/stat 32980 1727096589.92772: worker is 1 (out of 1 available) 32980 1727096589.92787: exiting _queue_task() for managed_node2/stat 32980 1727096589.92798: done queuing things up, now waiting for results queue to drain 32980 1727096589.92799: waiting for pending results... 
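The trace that follows shows the "Check if system is ostree" task being executed with the stat module over the already-established SSH ControlMaster connection: Ansible evaluates the conditional (not __network_is_ostree is defined), builds an AnsiballZ_stat.py payload, creates a remote temporary directory under /root/.ansible/tmp, transfers the payload with sftp, marks it executable with chmod u+x, and runs it with PYTHONVERBOSE=1, which is why the stdout chunks below are a verbose Python import log rather than module output. As a minimal sketch only, a tasks-file fragment of roughly the following shape would produce such a trace; the task name, module, and when-condition are taken from the log, while the stat path and register variable are assumptions that are not visible in this excerpt.

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed path; not shown in this log excerpt
  register: __ostree_booted_stat    # assumed register name; not shown in this log excerpt
  when: not __network_is_ostree is defined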
32980 1727096589.93032: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 32980 1727096589.93136: in run() - task 0afff68d-5257-457d-ef33-0000000000c2 32980 1727096589.93158: variable 'ansible_search_path' from source: unknown 32980 1727096589.93166: variable 'ansible_search_path' from source: unknown 32980 1727096589.93206: calling self._execute() 32980 1727096589.93285: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.93297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.93312: variable 'omit' from source: magic vars 32980 1727096589.93781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096589.94032: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096589.94082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096589.94120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096589.94191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096589.94290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096589.94322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096589.94360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096589.94398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096589.94527: Evaluated conditional (not __network_is_ostree is defined): True 32980 1727096589.94538: variable 'omit' from source: magic vars 32980 1727096589.94587: variable 'omit' from source: magic vars 32980 1727096589.94626: variable 'omit' from source: magic vars 32980 1727096589.94658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096589.94697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096589.94721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096589.94743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096589.94758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096589.94800: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096589.94808: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.94817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.94926: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096589.94937: Set connection var ansible_timeout to 10 32980 1727096589.94945: Set 
connection var ansible_shell_type to sh 32980 1727096589.94952: Set connection var ansible_connection to ssh 32980 1727096589.94965: Set connection var ansible_shell_executable to /bin/sh 32980 1727096589.94984: Set connection var ansible_pipelining to False 32980 1727096589.95014: variable 'ansible_shell_executable' from source: unknown 32980 1727096589.95022: variable 'ansible_connection' from source: unknown 32980 1727096589.95031: variable 'ansible_module_compression' from source: unknown 32980 1727096589.95037: variable 'ansible_shell_type' from source: unknown 32980 1727096589.95044: variable 'ansible_shell_executable' from source: unknown 32980 1727096589.95053: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096589.95061: variable 'ansible_pipelining' from source: unknown 32980 1727096589.95070: variable 'ansible_timeout' from source: unknown 32980 1727096589.95107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096589.95236: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096589.95252: variable 'omit' from source: magic vars 32980 1727096589.95261: starting attempt loop 32980 1727096589.95326: running the handler 32980 1727096589.95330: _low_level_execute_command(): starting 32980 1727096589.95332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096589.96015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096589.96030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096589.96047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.96065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.96092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096589.96157: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.96235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.96275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.96345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096589.98003: stdout chunk (state=3): >>>/root <<< 32980 1727096589.98103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096589.98131: stderr chunk (state=3): >>><<< 32980 1727096589.98134: stdout chunk (state=3): >>><<< 32980 1727096589.98159: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096589.98172: _low_level_execute_command(): starting 32980 1727096589.98175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794 `" && echo ansible-tmp-1727096589.9815392-33088-64448959661794="` echo /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794 `" ) && sleep 0' 32980 1727096589.98879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096589.98883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096589.98885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096589.98887: stderr chunk (state=3): >>>debug2: match found <<< 32980 1727096589.98889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096589.98910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096589.98921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096589.98939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096589.98995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.00942: stdout chunk (state=3): >>>ansible-tmp-1727096589.9815392-33088-64448959661794=/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794 <<< 32980 1727096590.01047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.01078: 
stderr chunk (state=3): >>><<< 32980 1727096590.01081: stdout chunk (state=3): >>><<< 32980 1727096590.01095: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096589.9815392-33088-64448959661794=/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.01136: variable 'ansible_module_compression' from source: unknown 32980 1727096590.01205: ANSIBALLZ: Using lock for stat 32980 1727096590.01208: ANSIBALLZ: Acquiring lock 32980 1727096590.01210: ANSIBALLZ: Lock acquired: 140258569803568 32980 1727096590.01213: ANSIBALLZ: Creating module 32980 1727096590.10228: ANSIBALLZ: Writing module into payload 32980 1727096590.10292: ANSIBALLZ: Writing module 32980 1727096590.10308: ANSIBALLZ: Renaming module 32980 1727096590.10314: ANSIBALLZ: Done creating module 32980 1727096590.10329: variable 'ansible_facts' from source: unknown 32980 1727096590.10373: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py 32980 1727096590.10471: Sending initial data 32980 1727096590.10478: Sent initial data (152 bytes) 32980 1727096590.10884: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096590.10888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.10902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.10950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.10953: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.11002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.12642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 32980 1727096590.12646: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096590.12679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096590.12752: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpabxboik1 /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py <<< 32980 1727096590.12755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py" <<< 32980 1727096590.12790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpabxboik1" to remote "/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py" <<< 32980 1727096590.13564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.13616: stderr chunk (state=3): >>><<< 32980 1727096590.13620: stdout chunk (state=3): >>><<< 32980 1727096590.13779: done transferring module to remote 32980 1727096590.13792: _low_level_execute_command(): starting 32980 1727096590.13796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/ /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py && sleep 0' 32980 1727096590.14487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 
1727096590.14505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.14518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.14528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.14581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.16331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.16366: stderr chunk (state=3): >>><<< 32980 1727096590.16371: stdout chunk (state=3): >>><<< 32980 1727096590.16411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.16414: _low_level_execute_command(): starting 32980 1727096590.16416: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/AnsiballZ_stat.py && sleep 0' 32980 1727096590.16898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096590.16901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096590.16903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096590.16905: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096590.16908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.16955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.16958: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.17002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.19133: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 32980 1727096590.19160: stdout chunk (state=3): >>>import _imp # builtin <<< 32980 1727096590.19204: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 32980 1727096590.19267: stdout chunk (state=3): >>>import '_io' # <<< 32980 1727096590.19292: stdout chunk (state=3): >>>import 'marshal' # <<< 32980 1727096590.19338: stdout chunk (state=3): >>>import 'posix' # <<< 32980 1727096590.19341: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 32980 1727096590.19344: stdout chunk (state=3): >>># installing zipimport hook <<< 32980 1727096590.19378: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 32980 1727096590.19427: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.19469: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 32980 1727096590.19504: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 32980 1727096590.19535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 32980 1727096590.19559: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968d8bb00> <<< 32980 1727096590.19583: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 32980 1727096590.19601: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dbea50> <<< 32980 1727096590.19611: stdout chunk (state=3): >>>import '_signal' # <<< 32980 1727096590.19646: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 32980 1727096590.19678: stdout chunk (state=3): >>>import 'io' # <<< 32980 1727096590.19699: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32980 1727096590.19789: stdout chunk (state=3): >>>import '_collections_abc' # <<< 32980 1727096590.19817: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 32980 1727096590.19860: stdout chunk (state=3): >>>import 'os' # <<< 32980 1727096590.19891: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 32980 1727096590.19912: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 32980 1727096590.19915: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 32980 1727096590.19952: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32980 1727096590.19971: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dcd130> <<< 32980 1727096590.20024: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 32980 1727096590.20040: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.20064: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dcdfa0> import 'site' # <<< 32980 1727096590.20106: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 32980 1727096590.20336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32980 1727096590.20348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 32980 1727096590.20375: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.20390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32980 1727096590.20432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 32980 1727096590.20476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32980 1727096590.20518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bcbe60> <<< 32980 1727096590.20524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 32980 1727096590.20542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32980 1727096590.20574: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bcbef0> <<< 32980 1727096590.20595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 32980 1727096590.20615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 32980 1727096590.20640: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32980 1727096590.20679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.20694: stdout chunk (state=3): >>>import 'itertools' # <<< 32980 1727096590.20730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from 
'/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 32980 1727096590.20757: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c03860> <<< 32980 1727096590.20764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c03ef0> <<< 32980 1727096590.20796: stdout chunk (state=3): >>>import '_collections' # <<< 32980 1727096590.20837: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be3b30> <<< 32980 1727096590.20876: stdout chunk (state=3): >>>import '_functools' # <<< 32980 1727096590.20892: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be1220> <<< 32980 1727096590.20963: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc9010> <<< 32980 1727096590.20994: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 32980 1727096590.21032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 32980 1727096590.21064: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 32980 1727096590.21201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 32980 1727096590.21254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c223c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bca8d0> <<< 32980 1727096590.21288: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c587d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 32980 1727096590.21401: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c58c80> import 'struct' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c58b30> <<< 32980 1727096590.21416: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c58f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 32980 1727096590.21554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32980 1727096590.21588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c595e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c592b0> import 'importlib.machinery' # <<< 32980 1727096590.21616: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32980 1727096590.21817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 32980 1727096590.21822: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c706b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c71d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 32980 1727096590.21858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c72c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c73260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c72150> <<< 32980 1727096590.21918: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32980 1727096590.21946: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096590.21960: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c73ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c73410> <<< 32980 1727096590.22056: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5a420> <<< 32980 1727096590.22072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 32980 1727096590.22075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32980 1727096590.22172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32980 1727096590.22176: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39689efc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 32980 1727096590.22206: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a187a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a18500> <<< 32980 1727096590.22241: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a186b0> <<< 32980 1727096590.22256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32980 1727096590.22322: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096590.22450: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a19040> <<< 32980 1727096590.22606: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a19a30> <<< 32980 1727096590.22648: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a188f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39689eddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32980 1727096590.22670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 32980 1727096590.22722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 32980 1727096590.22727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a1adb0> <<< 32980 1727096590.22754: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a18ec0> <<< 32980 1727096590.22769: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32980 1727096590.22852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.22855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32980 1727096590.23096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a470b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32980 1727096590.23101: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a6b440> <<< 32980 1727096590.23193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32980 1727096590.23197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32980 1727096590.23200: stdout chunk (state=3): >>>import 'ntpath' # <<< 32980 1727096590.23327: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968ac81d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 32980 1727096590.23359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32980 1727096590.23436: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968aca930> <<< 32980 1727096590.23488: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968ac82f0> <<< 32980 1727096590.23537: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a951f0> <<< 32980 1727096590.23586: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a959a0> <<< 32980 1727096590.23613: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a6a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a1bcb0> <<< 32980 1727096590.23709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32980 1727096590.23757: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3968a6a5a0> <<< 32980 1727096590.23866: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_l5bjfxxq/ansible_stat_payload.zip' # zipimport: zlib available <<< 32980 1727096590.24007: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.24043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32980 1727096590.24121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 32980 1727096590.24206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396837afc0> <<< 32980 1727096590.24243: stdout chunk (state=3): >>>import '_typing' # <<< 32980 1727096590.24444: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968359eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968359070> # zipimport: zlib available import 'ansible' # <<< 32980 1727096590.24477: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32980 1727096590.24526: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 32980 1727096590.24530: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.25900: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.27014: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 32980 1727096590.27020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968378e90> <<< 32980 1727096590.27042: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 32980 1727096590.27048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.27098: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 32980 1727096590.27123: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 32980 1727096590.27149: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683aa960> <<< 32980 1727096590.27184: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa6f0> <<< 32980 1727096590.27239: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa030> <<< 32980 1727096590.27257: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32980 1727096590.27285: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa480> <<< 32980 1727096590.27293: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396837b9e0> <<< 32980 1727096590.27303: stdout chunk (state=3): >>>import 'atexit' # <<< 32980 1727096590.27328: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683ab6e0> <<< 32980 1727096590.27375: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683ab920> <<< 32980 1727096590.27392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32980 1727096590.27435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 32980 1727096590.27447: stdout chunk (state=3): >>>import '_locale' # <<< 32980 
1727096590.27499: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683abe60> <<< 32980 1727096590.27529: stdout chunk (state=3): >>>import 'pwd' # <<< 32980 1727096590.27537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 32980 1727096590.27548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32980 1727096590.27590: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396820dbe0> <<< 32980 1727096590.27618: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396820f800> <<< 32980 1727096590.27641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32980 1727096590.27656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32980 1727096590.27693: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968214200> <<< 32980 1727096590.27714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32980 1727096590.27744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 32980 1727096590.27761: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682153a0> <<< 32980 1727096590.27785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32980 1727096590.27816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32980 1727096590.27839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32980 1727096590.27898: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968217e60> <<< 32980 1727096590.27937: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a47020> <<< 32980 1727096590.27958: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968216120> <<< 32980 1727096590.27980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32980 1727096590.28005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 32980 1727096590.28029: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 32980 1727096590.28054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32980 1727096590.28083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 32980 1727096590.28111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 32980 1727096590.28131: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821be00> <<< 32980 1727096590.28134: stdout chunk (state=3): >>>import '_tokenize' # <<< 32980 1727096590.28204: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821a630> <<< 32980 1727096590.28232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32980 1727096590.28239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32980 1727096590.28315: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821aba0> <<< 32980 1727096590.28340: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968216630> <<< 32980 1727096590.28370: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968263a40> <<< 32980 1727096590.28403: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682641d0> <<< 32980 1727096590.28430: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32980 1727096590.28444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 32980 1727096590.28465: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32980 1727096590.28510: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968265c10> <<< 32980 1727096590.28519: stdout chunk 
(state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682659d0> <<< 32980 1727096590.28531: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32980 1727096590.28643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32980 1727096590.28722: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682681a0> <<< 32980 1727096590.28725: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968266300> <<< 32980 1727096590.28774: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32980 1727096590.28809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.28830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 32980 1727096590.28863: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826b950> <<< 32980 1727096590.28972: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968268350> <<< 32980 1727096590.29036: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096590.29043: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826c770> <<< 32980 1727096590.29071: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826c950> <<< 32980 1727096590.29122: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096590.29125: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826cc50> <<< 32980 1727096590.29133: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682642f0> <<< 32980 1727096590.29162: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 32980 1727096590.29201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32980 1727096590.29219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32980 1727096590.29257: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32980 1727096590.29260: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682f83e0> <<< 32980 1727096590.29440: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682f98e0> <<< 32980 1727096590.29447: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826eb70> <<< 32980 1727096590.29516: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826e780> # zipimport: zlib available <<< 32980 1727096590.29519: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 32980 1727096590.29539: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.29624: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.29745: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 32980 1727096590.29773: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 32980 1727096590.29871: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.29993: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.30541: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.31116: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32980 1727096590.31211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682fdaf0> <<< 32980 1727096590.31266: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32980 1727096590.31301: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682fe8a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682f9a30> <<< 32980 1727096590.31370: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 32980 1727096590.31374: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.31486: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 32980 1727096590.31490: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.31554: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.31708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 32980 1727096590.31726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682fe5a0> <<< 32980 1727096590.31742: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32208: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32640: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32712: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32784: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32980 1727096590.32787: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32835: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32869: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32980 1727096590.32874: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.32937: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33023: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32980 1727096590.33037: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33054: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 32980 1727096590.33070: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33102: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33142: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32980 1727096590.33147: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33375: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.33599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32980 1727096590.33707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 32980 1727096590.33966: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682ffad0> <<< 32980 1727096590.33971: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 32980 1727096590.34015: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 32980 1727096590.34018: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34052: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34101: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34150: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34225: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32980 1727096590.34259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32980 1727096590.34345: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396810a600> <<< 32980 1727096590.34389: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39681053a0> <<< 32980 1727096590.34419: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 32980 1727096590.34489: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34550: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34651: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.34680: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32980 1727096590.34765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 32980 1727096590.34771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 32980 1727096590.34790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32980 1727096590.34961: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683dee40> <<< 32980 1727096590.35066: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683eeb10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396810a6c0> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f396826d970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 32980 1727096590.35097: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 32980 1727096590.35118: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32980 1727096590.35252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.35446: stdout chunk (state=3): >>># zipimport: zlib available <<< 32980 1727096590.35572: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 32980 1727096590.35944: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 32980 1727096590.35982: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants <<< 32980 1727096590.35996: stdout chunk (state=3): >>># cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] 
removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 32980 1727096590.36027: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 32980 1727096590.36103: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 32980 1727096590.36382: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32980 1727096590.36401: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 32980 1727096590.36447: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 32980 1727096590.36460: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy 
selectors # destroy errno <<< 32980 1727096590.36501: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux <<< 32980 1727096590.36519: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 32980 1727096590.36557: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 32980 1727096590.36581: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 32980 1727096590.36629: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 32980 1727096590.36633: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 32980 1727096590.36676: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 32980 1727096590.36700: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 32980 1727096590.36713: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32980 1727096590.36827: stdout chunk (state=3): >>># destroy sys.monitoring <<< 32980 1727096590.36971: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 32980 1727096590.36985: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 32980 1727096590.37061: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 32980 1727096590.37069: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 32980 1727096590.37098: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 32980 1727096590.37405: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 32980 1727096590.37435: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 32980 1727096590.37555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096590.37558: stdout chunk (state=3): >>><<< 32980 1727096590.37561: stderr chunk (state=3): >>><<< 32980 1727096590.37838: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968d8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968dcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bcbe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bcbef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c03860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c03ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be3b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be1220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc9010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c223c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968be20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bca8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c587d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c58c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c58b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c58f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968bc6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c595e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c592b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c706b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c71d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c72c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c73260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c72150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968c73ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c73410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5a420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39689efc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a187a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a18500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a186b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a19040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a19a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a188f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39689eddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a1adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a18ec0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968c5abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a470b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a6b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968ac81d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968aca930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968ac82f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a951f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a959a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a6a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968a1bcb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3968a6a5a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_l5bjfxxq/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396837afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968359eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968359070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968378e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683aa960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa6f0> import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683aa480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396837b9e0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683ab6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39683ab920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683abe60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396820dbe0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396820f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968214200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682153a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968217e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968a47020> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968216120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821be00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821a8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821a630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396821aba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968216630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968263a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682641d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3968265c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682659d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682681a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968266300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3968268350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826c770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826c950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826cc50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682642f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682f83e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682f98e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826eb70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396826fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826e780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f39682fdaf0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682fe8a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682f9a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682fe5a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39682ffad0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' 
executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f396810a600> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39681053a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683dee40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f39683eeb10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396810a6c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f396826d970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing 
genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # 
cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy 
importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # 
destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
[WARNING]: Module invocation had junk after the JSON data (same Python interpreter shutdown trace as shown above)
32980 1727096590.38939: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096590.38942: _low_level_execute_command(): starting 32980 1727096590.38945: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1727096589.9815392-33088-64448959661794/ > /dev/null 2>&1 && sleep 0' 32980 1727096590.39355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.39358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096590.39409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096590.39412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.39415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.39483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.39566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.39692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.39696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.39698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.41517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.41541: stderr chunk (state=3): >>><<< 32980 1727096590.41547: stdout chunk (state=3): >>><<< 32980 1727096590.41563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.41575: handler run complete 32980 1727096590.41638: attempt loop complete, returning result 32980 1727096590.41641: _execute() done 32980 1727096590.41643: dumping result to json 32980 1727096590.41645: done dumping result, returning 32980 1727096590.41647: done running TaskExecutor() for 
managed_node2/TASK: Check if system is ostree [0afff68d-5257-457d-ef33-0000000000c2] 32980 1727096590.41648: sending task result for task 0afff68d-5257-457d-ef33-0000000000c2 32980 1727096590.41714: done sending task result for task 0afff68d-5257-457d-ef33-0000000000c2 32980 1727096590.41716: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 32980 1727096590.41786: no more pending results, returning what we have 32980 1727096590.41789: results queue empty 32980 1727096590.41790: checking for any_errors_fatal 32980 1727096590.41797: done checking for any_errors_fatal 32980 1727096590.41798: checking for max_fail_percentage 32980 1727096590.41799: done checking for max_fail_percentage 32980 1727096590.41800: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.41801: done checking to see if all hosts have failed 32980 1727096590.41802: getting the remaining hosts for this loop 32980 1727096590.41803: done getting the remaining hosts for this loop 32980 1727096590.41806: getting the next task for host managed_node2 32980 1727096590.41812: done getting next task for host managed_node2 32980 1727096590.41814: ^ task is: TASK: Set flag to indicate system is ostree 32980 1727096590.41817: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.41821: getting variables 32980 1727096590.41822: in VariableManager get_vars() 32980 1727096590.41855: Calling all_inventory to load vars for managed_node2 32980 1727096590.41857: Calling groups_inventory to load vars for managed_node2 32980 1727096590.41860: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.41871: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.41876: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.41879: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.42029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.42146: done with get_vars() 32980 1727096590.42156: done getting variables 32980 1727096590.42227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 09:03:10 -0400 (0:00:00.497) 0:00:02.349 ****** 32980 1727096590.42247: entering _queue_task() for managed_node2/set_fact 32980 1727096590.42249: Creating lock for set_fact 32980 1727096590.42517: worker is 1 (out of 1 available) 32980 1727096590.42527: exiting _queue_task() for managed_node2/set_fact 32980 1727096590.42537: done queuing things up, now waiting for results queue to drain 32980 1727096590.42538: waiting for pending results... 
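For orientation, the "Check if system is ostree" result reported above (stat on /run/ostree-booted returning exists: false) corresponds to a task of roughly this shape. This is a sketch reconstructed from the module arguments and the __ostree_booted_stat register variable visible in this log, not a verbatim copy of el_repo_setup.yml:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # only 'path' is explicit; the other module_args in the log are stat defaults
  register: __ostree_booted_stat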
32980 1727096590.42891: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 32980 1727096590.42896: in run() - task 0afff68d-5257-457d-ef33-0000000000c3 32980 1727096590.43004: variable 'ansible_search_path' from source: unknown 32980 1727096590.43007: variable 'ansible_search_path' from source: unknown 32980 1727096590.43010: calling self._execute() 32980 1727096590.43064: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.43072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.43081: variable 'omit' from source: magic vars 32980 1727096590.43634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096590.44060: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096590.44172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096590.44178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096590.44188: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096590.44275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096590.44310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096590.44345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096590.44376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096590.44511: Evaluated conditional (not __network_is_ostree is defined): True 32980 1727096590.44535: variable 'omit' from source: magic vars 32980 1727096590.44571: variable 'omit' from source: magic vars 32980 1727096590.44730: variable '__ostree_booted_stat' from source: set_fact 32980 1727096590.44761: variable 'omit' from source: magic vars 32980 1727096590.44793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096590.44825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096590.44862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096590.44945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.44949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.44951: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096590.44953: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.44956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.45038: Set connection var ansible_module_compression to ZIP_DEFLATED 
32980 1727096590.45048: Set connection var ansible_timeout to 10 32980 1727096590.45055: Set connection var ansible_shell_type to sh 32980 1727096590.45061: Set connection var ansible_connection to ssh 32980 1727096590.45081: Set connection var ansible_shell_executable to /bin/sh 32980 1727096590.45089: Set connection var ansible_pipelining to False 32980 1727096590.45112: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.45119: variable 'ansible_connection' from source: unknown 32980 1727096590.45172: variable 'ansible_module_compression' from source: unknown 32980 1727096590.45178: variable 'ansible_shell_type' from source: unknown 32980 1727096590.45180: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.45182: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.45184: variable 'ansible_pipelining' from source: unknown 32980 1727096590.45186: variable 'ansible_timeout' from source: unknown 32980 1727096590.45188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.45306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096590.45309: variable 'omit' from source: magic vars 32980 1727096590.45311: starting attempt loop 32980 1727096590.45314: running the handler 32980 1727096590.45316: handler run complete 32980 1727096590.45318: attempt loop complete, returning result 32980 1727096590.45320: _execute() done 32980 1727096590.45322: dumping result to json 32980 1727096590.45324: done dumping result, returning 32980 1727096590.45333: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0afff68d-5257-457d-ef33-0000000000c3] 32980 1727096590.45343: sending task result for task 0afff68d-5257-457d-ef33-0000000000c3 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 32980 1727096590.45521: no more pending results, returning what we have 32980 1727096590.45524: results queue empty 32980 1727096590.45525: checking for any_errors_fatal 32980 1727096590.45531: done checking for any_errors_fatal 32980 1727096590.45532: checking for max_fail_percentage 32980 1727096590.45534: done checking for max_fail_percentage 32980 1727096590.45534: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.45535: done checking to see if all hosts have failed 32980 1727096590.45536: getting the remaining hosts for this loop 32980 1727096590.45537: done getting the remaining hosts for this loop 32980 1727096590.45540: getting the next task for host managed_node2 32980 1727096590.45549: done getting next task for host managed_node2 32980 1727096590.45551: ^ task is: TASK: Fix CentOS6 Base repo 32980 1727096590.45554: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.45558: getting variables 32980 1727096590.45559: in VariableManager get_vars() 32980 1727096590.45594: Calling all_inventory to load vars for managed_node2 32980 1727096590.45597: Calling groups_inventory to load vars for managed_node2 32980 1727096590.45600: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.45611: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.45613: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.45617: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.45901: done sending task result for task 0afff68d-5257-457d-ef33-0000000000c3 32980 1727096590.45910: WORKER PROCESS EXITING 32980 1727096590.45933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.46162: done with get_vars() 32980 1727096590.46262: done getting variables 32980 1727096590.46490: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 09:03:10 -0400 (0:00:00.042) 0:00:02.391 ****** 32980 1727096590.46515: entering _queue_task() for managed_node2/copy 32980 1727096590.46952: worker is 1 (out of 1 available) 32980 1727096590.46963: exiting _queue_task() for managed_node2/copy 32980 1727096590.46976: done queuing things up, now waiting for results queue to drain 32980 1727096590.46978: waiting for pending results... 
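The "Fix CentOS6 Base repo" task queued above (el_repo_setup.yml:26) uses the copy action and, as the next entries show, is skipped because ansible_distribution_major_version is not '6'. A rough sketch follows, assuming the usual approach of rewriting the yum repo file to point at the CentOS vault; only the module and the two conditions are visible in the log, so the destination path and file content are placeholders.

  - name: Fix CentOS6 Base repo
    copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo   # guessed destination, not shown in the log
      content: |
        # placeholder: repo definition pointing at the CentOS 6 vault mirrors
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'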
32980 1727096590.47510: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 32980 1727096590.47591: in run() - task 0afff68d-5257-457d-ef33-0000000000c5 32980 1727096590.47603: variable 'ansible_search_path' from source: unknown 32980 1727096590.47607: variable 'ansible_search_path' from source: unknown 32980 1727096590.47686: calling self._execute() 32980 1727096590.47816: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.47820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.47831: variable 'omit' from source: magic vars 32980 1727096590.48682: variable 'ansible_distribution' from source: facts 32980 1727096590.48685: Evaluated conditional (ansible_distribution == 'CentOS'): True 32980 1727096590.48806: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.48810: Evaluated conditional (ansible_distribution_major_version == '6'): False 32980 1727096590.48815: when evaluation is False, skipping this task 32980 1727096590.48818: _execute() done 32980 1727096590.48820: dumping result to json 32980 1727096590.48822: done dumping result, returning 32980 1727096590.48829: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0afff68d-5257-457d-ef33-0000000000c5] 32980 1727096590.48834: sending task result for task 0afff68d-5257-457d-ef33-0000000000c5 32980 1727096590.48946: done sending task result for task 0afff68d-5257-457d-ef33-0000000000c5 32980 1727096590.48949: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 32980 1727096590.49027: no more pending results, returning what we have 32980 1727096590.49029: results queue empty 32980 1727096590.49030: checking for any_errors_fatal 32980 1727096590.49033: done checking for any_errors_fatal 32980 1727096590.49034: checking for max_fail_percentage 32980 1727096590.49036: done checking for max_fail_percentage 32980 1727096590.49036: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.49037: done checking to see if all hosts have failed 32980 1727096590.49038: getting the remaining hosts for this loop 32980 1727096590.49039: done getting the remaining hosts for this loop 32980 1727096590.49042: getting the next task for host managed_node2 32980 1727096590.49047: done getting next task for host managed_node2 32980 1727096590.49050: ^ task is: TASK: Include the task 'enable_epel.yml' 32980 1727096590.49053: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.49056: getting variables 32980 1727096590.49057: in VariableManager get_vars() 32980 1727096590.49080: Calling all_inventory to load vars for managed_node2 32980 1727096590.49082: Calling groups_inventory to load vars for managed_node2 32980 1727096590.49085: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.49094: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.49097: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.49100: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.49276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.49484: done with get_vars() 32980 1727096590.49493: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 09:03:10 -0400 (0:00:00.030) 0:00:02.422 ****** 32980 1727096590.49584: entering _queue_task() for managed_node2/include_tasks 32980 1727096590.49815: worker is 1 (out of 1 available) 32980 1727096590.49825: exiting _queue_task() for managed_node2/include_tasks 32980 1727096590.49836: done queuing things up, now waiting for results queue to drain 32980 1727096590.49837: waiting for pending results... 32980 1727096590.50439: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 32980 1727096590.50444: in run() - task 0afff68d-5257-457d-ef33-0000000000c6 32980 1727096590.50446: variable 'ansible_search_path' from source: unknown 32980 1727096590.50452: variable 'ansible_search_path' from source: unknown 32980 1727096590.50455: calling self._execute() 32980 1727096590.50627: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.50631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.50717: variable 'omit' from source: magic vars 32980 1727096590.51246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096590.54162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096590.54166: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096590.54170: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096590.54207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096590.54241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096590.54324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096590.54357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096590.54391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 32980 1727096590.54434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096590.54451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096590.54566: variable '__network_is_ostree' from source: set_fact 32980 1727096590.54595: Evaluated conditional (not __network_is_ostree | d(false)): True 32980 1727096590.54604: _execute() done 32980 1727096590.54610: dumping result to json 32980 1727096590.54619: done dumping result, returning 32980 1727096590.54628: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-457d-ef33-0000000000c6] 32980 1727096590.54635: sending task result for task 0afff68d-5257-457d-ef33-0000000000c6 32980 1727096590.54912: done sending task result for task 0afff68d-5257-457d-ef33-0000000000c6 32980 1727096590.54916: WORKER PROCESS EXITING 32980 1727096590.54940: no more pending results, returning what we have 32980 1727096590.54944: in VariableManager get_vars() 32980 1727096590.54975: Calling all_inventory to load vars for managed_node2 32980 1727096590.54977: Calling groups_inventory to load vars for managed_node2 32980 1727096590.54980: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.54989: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.54991: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.54994: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.55201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.55389: done with get_vars() 32980 1727096590.55396: variable 'ansible_search_path' from source: unknown 32980 1727096590.55397: variable 'ansible_search_path' from source: unknown 32980 1727096590.55431: we have included files to process 32980 1727096590.55432: generating all_blocks data 32980 1727096590.55434: done generating all_blocks data 32980 1727096590.55438: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32980 1727096590.55440: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32980 1727096590.55442: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32980 1727096590.56286: done processing included file 32980 1727096590.56289: iterating over new_blocks loaded from include file 32980 1727096590.56290: in VariableManager get_vars() 32980 1727096590.56302: done with get_vars() 32980 1727096590.56304: filtering new block on tags 32980 1727096590.56326: done filtering new block on tags 32980 1727096590.56329: in VariableManager get_vars() 32980 1727096590.56349: done with get_vars() 32980 1727096590.56351: filtering new block on tags 32980 1727096590.56362: done filtering new block on tags 32980 1727096590.56364: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 32980 1727096590.56372: extending task lists for all hosts with included blocks 
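The include just processed corresponds to the task at el_repo_setup.yml:51. Its shape can be reconstructed from the log with reasonable confidence, since both the action (include_tasks) and the evaluated condition are recorded; only the relative path is an assumption.

  - name: Include the task 'enable_epel.yml'
    include_tasks: tasks/enable_epel.yml   # relative path is an assumption
    when: not __network_is_ostree | d(false)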
32980 1727096590.56477: done extending task lists 32980 1727096590.56478: done processing included files 32980 1727096590.56479: results queue empty 32980 1727096590.56480: checking for any_errors_fatal 32980 1727096590.56483: done checking for any_errors_fatal 32980 1727096590.56484: checking for max_fail_percentage 32980 1727096590.56485: done checking for max_fail_percentage 32980 1727096590.56485: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.56486: done checking to see if all hosts have failed 32980 1727096590.56487: getting the remaining hosts for this loop 32980 1727096590.56488: done getting the remaining hosts for this loop 32980 1727096590.56490: getting the next task for host managed_node2 32980 1727096590.56495: done getting next task for host managed_node2 32980 1727096590.56497: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 32980 1727096590.56500: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.56502: getting variables 32980 1727096590.56503: in VariableManager get_vars() 32980 1727096590.56511: Calling all_inventory to load vars for managed_node2 32980 1727096590.56513: Calling groups_inventory to load vars for managed_node2 32980 1727096590.56515: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.56520: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.56528: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.56531: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.56705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.56918: done with get_vars() 32980 1727096590.56926: done getting variables 32980 1727096590.57001: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 32980 1727096590.57197: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 09:03:10 -0400 (0:00:00.076) 0:00:02.499 ****** 32980 1727096590.57249: entering _queue_task() for managed_node2/command 32980 1727096590.57251: Creating lock for command 32980 1727096590.57662: worker is 1 (out of 1 available) 32980 1727096590.57680: exiting _queue_task() for managed_node2/command 32980 1727096590.57689: done queuing things up, now waiting for results queue to drain 32980 1727096590.57691: waiting for pending results... 
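The "Create EPEL {{ ansible_distribution_major_version }}" task queued above (enable_epel.yml:8, rendered as "Create EPEL 10" for this host) runs through the command action and is gated on the two conditions the log evaluates next. The command line itself never appears in this log; the one below is a common pattern for installing the epel-release package and should be read as a placeholder.

  - name: Create EPEL {{ ansible_distribution_major_version }}
    command: >-
      rpm -iVh --nodigest --nosignature
      https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']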
32980 1727096590.57885: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 32980 1727096590.57976: in run() - task 0afff68d-5257-457d-ef33-0000000000e0 32980 1727096590.57997: variable 'ansible_search_path' from source: unknown 32980 1727096590.58008: variable 'ansible_search_path' from source: unknown 32980 1727096590.58051: calling self._execute() 32980 1727096590.58136: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.58149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.58164: variable 'omit' from source: magic vars 32980 1727096590.58587: variable 'ansible_distribution' from source: facts 32980 1727096590.58602: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32980 1727096590.58903: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.58984: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32980 1727096590.58987: when evaluation is False, skipping this task 32980 1727096590.58989: _execute() done 32980 1727096590.58991: dumping result to json 32980 1727096590.58993: done dumping result, returning 32980 1727096590.58995: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0afff68d-5257-457d-ef33-0000000000e0] 32980 1727096590.58996: sending task result for task 0afff68d-5257-457d-ef33-0000000000e0 32980 1727096590.59070: done sending task result for task 0afff68d-5257-457d-ef33-0000000000e0 32980 1727096590.59074: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32980 1727096590.59126: no more pending results, returning what we have 32980 1727096590.59129: results queue empty 32980 1727096590.59130: checking for any_errors_fatal 32980 1727096590.59132: done checking for any_errors_fatal 32980 1727096590.59132: checking for max_fail_percentage 32980 1727096590.59134: done checking for max_fail_percentage 32980 1727096590.59135: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.59136: done checking to see if all hosts have failed 32980 1727096590.59137: getting the remaining hosts for this loop 32980 1727096590.59138: done getting the remaining hosts for this loop 32980 1727096590.59142: getting the next task for host managed_node2 32980 1727096590.59149: done getting next task for host managed_node2 32980 1727096590.59152: ^ task is: TASK: Install yum-utils package 32980 1727096590.59155: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.59160: getting variables 32980 1727096590.59162: in VariableManager get_vars() 32980 1727096590.59192: Calling all_inventory to load vars for managed_node2 32980 1727096590.59195: Calling groups_inventory to load vars for managed_node2 32980 1727096590.59198: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.59210: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.59213: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.59215: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.59613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.59809: done with get_vars() 32980 1727096590.59822: done getting variables 32980 1727096590.59910: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 09:03:10 -0400 (0:00:00.026) 0:00:02.526 ****** 32980 1727096590.59950: entering _queue_task() for managed_node2/package 32980 1727096590.59952: Creating lock for package 32980 1727096590.60226: worker is 1 (out of 1 available) 32980 1727096590.60237: exiting _queue_task() for managed_node2/package 32980 1727096590.60369: done queuing things up, now waiting for results queue to drain 32980 1727096590.60374: waiting for pending results... 
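The "Install yum-utils package" task at enable_epel.yml:26 uses the generic package action; the package name is inferred from the task name and the conditions from the skip result that follows, so the sketch is close but still reconstructed rather than quoted.

  - name: Install yum-utils package
    package:
      name: yum-utils
      state: present
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']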
32980 1727096590.60614: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 32980 1727096590.60695: in run() - task 0afff68d-5257-457d-ef33-0000000000e1 32980 1727096590.60801: variable 'ansible_search_path' from source: unknown 32980 1727096590.60805: variable 'ansible_search_path' from source: unknown 32980 1727096590.60808: calling self._execute() 32980 1727096590.60849: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.60862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.60879: variable 'omit' from source: magic vars 32980 1727096590.61298: variable 'ansible_distribution' from source: facts 32980 1727096590.61321: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32980 1727096590.61498: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.61510: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32980 1727096590.61520: when evaluation is False, skipping this task 32980 1727096590.61564: _execute() done 32980 1727096590.61568: dumping result to json 32980 1727096590.61571: done dumping result, returning 32980 1727096590.61579: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0afff68d-5257-457d-ef33-0000000000e1] 32980 1727096590.61583: sending task result for task 0afff68d-5257-457d-ef33-0000000000e1 32980 1727096590.61784: done sending task result for task 0afff68d-5257-457d-ef33-0000000000e1 32980 1727096590.61788: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32980 1727096590.61837: no more pending results, returning what we have 32980 1727096590.61841: results queue empty 32980 1727096590.61842: checking for any_errors_fatal 32980 1727096590.61850: done checking for any_errors_fatal 32980 1727096590.61851: checking for max_fail_percentage 32980 1727096590.61852: done checking for max_fail_percentage 32980 1727096590.61853: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.61854: done checking to see if all hosts have failed 32980 1727096590.61854: getting the remaining hosts for this loop 32980 1727096590.61856: done getting the remaining hosts for this loop 32980 1727096590.61861: getting the next task for host managed_node2 32980 1727096590.61875: done getting next task for host managed_node2 32980 1727096590.61878: ^ task is: TASK: Enable EPEL 7 32980 1727096590.61882: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.62002: getting variables 32980 1727096590.62004: in VariableManager get_vars() 32980 1727096590.62033: Calling all_inventory to load vars for managed_node2 32980 1727096590.62036: Calling groups_inventory to load vars for managed_node2 32980 1727096590.62039: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.62049: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.62052: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.62055: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.62274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.62510: done with get_vars() 32980 1727096590.62519: done getting variables 32980 1727096590.62585: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 09:03:10 -0400 (0:00:00.026) 0:00:02.552 ****** 32980 1727096590.62623: entering _queue_task() for managed_node2/command 32980 1727096590.62992: worker is 1 (out of 1 available) 32980 1727096590.63004: exiting _queue_task() for managed_node2/command 32980 1727096590.63012: done queuing things up, now waiting for results queue to drain 32980 1727096590.63014: waiting for pending results... 32980 1727096590.63217: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 32980 1727096590.63305: in run() - task 0afff68d-5257-457d-ef33-0000000000e2 32980 1727096590.63332: variable 'ansible_search_path' from source: unknown 32980 1727096590.63346: variable 'ansible_search_path' from source: unknown 32980 1727096590.63425: calling self._execute() 32980 1727096590.63480: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.63493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.63507: variable 'omit' from source: magic vars 32980 1727096590.63990: variable 'ansible_distribution' from source: facts 32980 1727096590.64011: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32980 1727096590.64167: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.64191: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32980 1727096590.64194: when evaluation is False, skipping this task 32980 1727096590.64200: _execute() done 32980 1727096590.64215: dumping result to json 32980 1727096590.64218: done dumping result, returning 32980 1727096590.64299: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0afff68d-5257-457d-ef33-0000000000e2] 32980 1727096590.64303: sending task result for task 0afff68d-5257-457d-ef33-0000000000e2 32980 1727096590.64366: done sending task result for task 0afff68d-5257-457d-ef33-0000000000e2 32980 1727096590.64371: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32980 1727096590.64489: no more pending results, returning what we 
have 32980 1727096590.64492: results queue empty 32980 1727096590.64493: checking for any_errors_fatal 32980 1727096590.64498: done checking for any_errors_fatal 32980 1727096590.64499: checking for max_fail_percentage 32980 1727096590.64500: done checking for max_fail_percentage 32980 1727096590.64501: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.64502: done checking to see if all hosts have failed 32980 1727096590.64503: getting the remaining hosts for this loop 32980 1727096590.64504: done getting the remaining hosts for this loop 32980 1727096590.64508: getting the next task for host managed_node2 32980 1727096590.64522: done getting next task for host managed_node2 32980 1727096590.64524: ^ task is: TASK: Enable EPEL 8 32980 1727096590.64528: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.64532: getting variables 32980 1727096590.64534: in VariableManager get_vars() 32980 1727096590.64565: Calling all_inventory to load vars for managed_node2 32980 1727096590.64576: Calling groups_inventory to load vars for managed_node2 32980 1727096590.64582: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.64595: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.64598: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.64602: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.64931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.65155: done with get_vars() 32980 1727096590.65163: done getting variables 32980 1727096590.65229: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 09:03:10 -0400 (0:00:00.026) 0:00:02.579 ****** 32980 1727096590.65252: entering _queue_task() for managed_node2/command 32980 1727096590.65553: worker is 1 (out of 1 available) 32980 1727096590.65562: exiting _queue_task() for managed_node2/command 32980 1727096590.65576: done queuing things up, now waiting for results queue to drain 32980 1727096590.65578: waiting for pending results... 
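The "Enable EPEL 7" and "Enable EPEL 8" tasks (enable_epel.yml:32 and :37) both go through the command action and are skipped here for the same version condition, while "Enable EPEL 6" (enable_epel.yml:42) uses copy and is gated on major version 6. A sketch of the command variant is below; the command line is a guess, since only the module and the conditions are visible in the log.

  - name: Enable EPEL 7
    command: yum-config-manager --enable epel   # hypothetical command; not shown in the log
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']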
32980 1727096590.65866: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 32980 1727096590.65877: in run() - task 0afff68d-5257-457d-ef33-0000000000e3 32980 1727096590.65884: variable 'ansible_search_path' from source: unknown 32980 1727096590.65887: variable 'ansible_search_path' from source: unknown 32980 1727096590.65889: calling self._execute() 32980 1727096590.65956: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.65970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.65985: variable 'omit' from source: magic vars 32980 1727096590.66371: variable 'ansible_distribution' from source: facts 32980 1727096590.66390: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32980 1727096590.66556: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.66646: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32980 1727096590.66650: when evaluation is False, skipping this task 32980 1727096590.66653: _execute() done 32980 1727096590.66655: dumping result to json 32980 1727096590.66657: done dumping result, returning 32980 1727096590.66659: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0afff68d-5257-457d-ef33-0000000000e3] 32980 1727096590.66662: sending task result for task 0afff68d-5257-457d-ef33-0000000000e3 32980 1727096590.66729: done sending task result for task 0afff68d-5257-457d-ef33-0000000000e3 32980 1727096590.66732: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32980 1727096590.66794: no more pending results, returning what we have 32980 1727096590.66798: results queue empty 32980 1727096590.66798: checking for any_errors_fatal 32980 1727096590.66803: done checking for any_errors_fatal 32980 1727096590.66804: checking for max_fail_percentage 32980 1727096590.66805: done checking for max_fail_percentage 32980 1727096590.66806: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.66807: done checking to see if all hosts have failed 32980 1727096590.66807: getting the remaining hosts for this loop 32980 1727096590.66809: done getting the remaining hosts for this loop 32980 1727096590.66812: getting the next task for host managed_node2 32980 1727096590.66821: done getting next task for host managed_node2 32980 1727096590.66823: ^ task is: TASK: Enable EPEL 6 32980 1727096590.66827: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.66831: getting variables 32980 1727096590.66832: in VariableManager get_vars() 32980 1727096590.66933: Calling all_inventory to load vars for managed_node2 32980 1727096590.66936: Calling groups_inventory to load vars for managed_node2 32980 1727096590.66939: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.66947: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.66950: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.66953: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.67123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.67537: done with get_vars() 32980 1727096590.67546: done getting variables 32980 1727096590.67748: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 09:03:10 -0400 (0:00:00.025) 0:00:02.605 ****** 32980 1727096590.67883: entering _queue_task() for managed_node2/copy 32980 1727096590.68378: worker is 1 (out of 1 available) 32980 1727096590.68390: exiting _queue_task() for managed_node2/copy 32980 1727096590.68399: done queuing things up, now waiting for results queue to drain 32980 1727096590.68401: waiting for pending results... 32980 1727096590.68616: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 32980 1727096590.68656: in run() - task 0afff68d-5257-457d-ef33-0000000000e5 32980 1727096590.68675: variable 'ansible_search_path' from source: unknown 32980 1727096590.68684: variable 'ansible_search_path' from source: unknown 32980 1727096590.68736: calling self._execute() 32980 1727096590.68829: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.68941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.68945: variable 'omit' from source: magic vars 32980 1727096590.69357: variable 'ansible_distribution' from source: facts 32980 1727096590.69377: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32980 1727096590.69500: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.69512: Evaluated conditional (ansible_distribution_major_version == '6'): False 32980 1727096590.69519: when evaluation is False, skipping this task 32980 1727096590.69531: _execute() done 32980 1727096590.69539: dumping result to json 32980 1727096590.69545: done dumping result, returning 32980 1727096590.69554: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0afff68d-5257-457d-ef33-0000000000e5] 32980 1727096590.69570: sending task result for task 0afff68d-5257-457d-ef33-0000000000e5 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 32980 1727096590.69817: no more pending results, returning what we have 32980 1727096590.69820: results queue empty 32980 1727096590.69821: checking for any_errors_fatal 32980 1727096590.69825: done checking for any_errors_fatal 32980 
1727096590.69826: checking for max_fail_percentage 32980 1727096590.69827: done checking for max_fail_percentage 32980 1727096590.69828: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.69829: done checking to see if all hosts have failed 32980 1727096590.69830: getting the remaining hosts for this loop 32980 1727096590.69831: done getting the remaining hosts for this loop 32980 1727096590.69835: getting the next task for host managed_node2 32980 1727096590.69850: done getting next task for host managed_node2 32980 1727096590.69853: ^ task is: TASK: Set network provider to 'nm' 32980 1727096590.69856: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.69863: getting variables 32980 1727096590.69865: in VariableManager get_vars() 32980 1727096590.69900: Calling all_inventory to load vars for managed_node2 32980 1727096590.69905: Calling groups_inventory to load vars for managed_node2 32980 1727096590.69909: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.69922: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.69928: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.69931: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.70294: done sending task result for task 0afff68d-5257-457d-ef33-0000000000e5 32980 1727096590.70298: WORKER PROCESS EXITING 32980 1727096590.70319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.70522: done with get_vars() 32980 1727096590.70530: done getting variables 32980 1727096590.70585: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13 Monday 23 September 2024 09:03:10 -0400 (0:00:00.027) 0:00:02.632 ****** 32980 1727096590.70608: entering _queue_task() for managed_node2/set_fact 32980 1727096590.70943: worker is 1 (out of 1 available) 32980 1727096590.70953: exiting _queue_task() for managed_node2/set_fact 32980 1727096590.70966: done queuing things up, now waiting for results queue to drain 32980 1727096590.70969: waiting for pending results... 
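The "Set network provider to 'nm'" task queued above comes from tests_vlan_mtu_nm.yml:13. Both the fact name and its value are confirmed by the result printed further down (network_provider: "nm"), so the sketch below should match the real task closely, though it is still reconstructed from the log rather than copied from the file.

  - name: Set network provider to 'nm'
    set_fact:
      network_provider: nm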
32980 1727096590.71096: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 32980 1727096590.71185: in run() - task 0afff68d-5257-457d-ef33-000000000007 32980 1727096590.71205: variable 'ansible_search_path' from source: unknown 32980 1727096590.71240: calling self._execute() 32980 1727096590.71322: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.71332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.71346: variable 'omit' from source: magic vars 32980 1727096590.71457: variable 'omit' from source: magic vars 32980 1727096590.71505: variable 'omit' from source: magic vars 32980 1727096590.71555: variable 'omit' from source: magic vars 32980 1727096590.71613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096590.71659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096590.71696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096590.71732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.71751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.71795: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096590.71808: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.71828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.71959: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096590.71962: Set connection var ansible_timeout to 10 32980 1727096590.71964: Set connection var ansible_shell_type to sh 32980 1727096590.71966: Set connection var ansible_connection to ssh 32980 1727096590.72039: Set connection var ansible_shell_executable to /bin/sh 32980 1727096590.72042: Set connection var ansible_pipelining to False 32980 1727096590.72045: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.72047: variable 'ansible_connection' from source: unknown 32980 1727096590.72049: variable 'ansible_module_compression' from source: unknown 32980 1727096590.72050: variable 'ansible_shell_type' from source: unknown 32980 1727096590.72052: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.72054: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.72056: variable 'ansible_pipelining' from source: unknown 32980 1727096590.72058: variable 'ansible_timeout' from source: unknown 32980 1727096590.72060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.72210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096590.72224: variable 'omit' from source: magic vars 32980 1727096590.72231: starting attempt loop 32980 1727096590.72236: running the handler 32980 1727096590.72248: handler run complete 32980 1727096590.72265: attempt loop complete, returning result 32980 1727096590.72273: _execute() done 32980 1727096590.72282: 
dumping result to json 32980 1727096590.72366: done dumping result, returning 32980 1727096590.72371: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0afff68d-5257-457d-ef33-000000000007] 32980 1727096590.72373: sending task result for task 0afff68d-5257-457d-ef33-000000000007 32980 1727096590.72440: done sending task result for task 0afff68d-5257-457d-ef33-000000000007 32980 1727096590.72443: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 32980 1727096590.72504: no more pending results, returning what we have 32980 1727096590.72507: results queue empty 32980 1727096590.72508: checking for any_errors_fatal 32980 1727096590.72513: done checking for any_errors_fatal 32980 1727096590.72514: checking for max_fail_percentage 32980 1727096590.72516: done checking for max_fail_percentage 32980 1727096590.72517: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.72518: done checking to see if all hosts have failed 32980 1727096590.72518: getting the remaining hosts for this loop 32980 1727096590.72520: done getting the remaining hosts for this loop 32980 1727096590.72524: getting the next task for host managed_node2 32980 1727096590.72532: done getting next task for host managed_node2 32980 1727096590.72534: ^ task is: TASK: meta (flush_handlers) 32980 1727096590.72535: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.72541: getting variables 32980 1727096590.72543: in VariableManager get_vars() 32980 1727096590.72739: Calling all_inventory to load vars for managed_node2 32980 1727096590.72742: Calling groups_inventory to load vars for managed_node2 32980 1727096590.72766: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.72778: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.72781: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.72784: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.72986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.73234: done with get_vars() 32980 1727096590.73246: done getting variables 32980 1727096590.73322: in VariableManager get_vars() 32980 1727096590.73333: Calling all_inventory to load vars for managed_node2 32980 1727096590.73336: Calling groups_inventory to load vars for managed_node2 32980 1727096590.73338: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.73342: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.73344: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.73347: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.73504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.73692: done with get_vars() 32980 1727096590.73705: done queuing things up, now waiting for results queue to drain 32980 1727096590.73707: results queue empty 32980 1727096590.73708: checking for any_errors_fatal 32980 1727096590.73715: done checking for any_errors_fatal 32980 1727096590.73716: checking for 
max_fail_percentage 32980 1727096590.73717: done checking for max_fail_percentage 32980 1727096590.73718: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.73719: done checking to see if all hosts have failed 32980 1727096590.73719: getting the remaining hosts for this loop 32980 1727096590.73720: done getting the remaining hosts for this loop 32980 1727096590.73723: getting the next task for host managed_node2 32980 1727096590.73734: done getting next task for host managed_node2 32980 1727096590.73736: ^ task is: TASK: meta (flush_handlers) 32980 1727096590.73737: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.73754: getting variables 32980 1727096590.73756: in VariableManager get_vars() 32980 1727096590.73764: Calling all_inventory to load vars for managed_node2 32980 1727096590.73766: Calling groups_inventory to load vars for managed_node2 32980 1727096590.73770: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.73774: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.73776: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.73779: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.73955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.74444: done with get_vars() 32980 1727096590.74451: done getting variables 32980 1727096590.74502: in VariableManager get_vars() 32980 1727096590.74516: Calling all_inventory to load vars for managed_node2 32980 1727096590.74519: Calling groups_inventory to load vars for managed_node2 32980 1727096590.74521: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.74525: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.74528: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.74530: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.74670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.74866: done with get_vars() 32980 1727096590.74882: done queuing things up, now waiting for results queue to drain 32980 1727096590.74884: results queue empty 32980 1727096590.74885: checking for any_errors_fatal 32980 1727096590.74886: done checking for any_errors_fatal 32980 1727096590.74887: checking for max_fail_percentage 32980 1727096590.74888: done checking for max_fail_percentage 32980 1727096590.74888: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.74889: done checking to see if all hosts have failed 32980 1727096590.74890: getting the remaining hosts for this loop 32980 1727096590.74891: done getting the remaining hosts for this loop 32980 1727096590.74893: getting the next task for host managed_node2 32980 1727096590.74896: done getting next task for host managed_node2 32980 1727096590.74896: ^ task is: None 32980 1727096590.74898: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.74899: done queuing things up, now waiting for results queue to drain 32980 1727096590.74900: results queue empty 32980 1727096590.74900: checking for any_errors_fatal 32980 1727096590.74901: done checking for any_errors_fatal 32980 1727096590.74902: checking for max_fail_percentage 32980 1727096590.74902: done checking for max_fail_percentage 32980 1727096590.74903: checking to see if all hosts have failed and the running result is not ok 32980 1727096590.74904: done checking to see if all hosts have failed 32980 1727096590.74905: getting the next task for host managed_node2 32980 1727096590.74908: done getting next task for host managed_node2 32980 1727096590.74908: ^ task is: None 32980 1727096590.74910: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096590.74952: in VariableManager get_vars() 32980 1727096590.74982: done with get_vars() 32980 1727096590.74990: in VariableManager get_vars() 32980 1727096590.75008: done with get_vars() 32980 1727096590.75017: variable 'omit' from source: magic vars 32980 1727096590.75054: in VariableManager get_vars() 32980 1727096590.75090: done with get_vars() 32980 1727096590.75112: variable 'omit' from source: magic vars PLAY [Play for testing vlan mtu setting] *************************************** 32980 1727096590.75526: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32980 1727096590.75550: getting the remaining hosts for this loop 32980 1727096590.75551: done getting the remaining hosts for this loop 32980 1727096590.75554: getting the next task for host managed_node2 32980 1727096590.75556: done getting next task for host managed_node2 32980 1727096590.75558: ^ task is: TASK: Gathering Facts 32980 1727096590.75559: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096590.75561: getting variables 32980 1727096590.75562: in VariableManager get_vars() 32980 1727096590.75576: Calling all_inventory to load vars for managed_node2 32980 1727096590.75578: Calling groups_inventory to load vars for managed_node2 32980 1727096590.75580: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096590.75584: Calling all_plugins_play to load vars for managed_node2 32980 1727096590.75597: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096590.75600: Calling groups_plugins_play to load vars for managed_node2 32980 1727096590.75750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096590.75999: done with get_vars() 32980 1727096590.76007: done getting variables 32980 1727096590.76046: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3 Monday 23 September 2024 09:03:10 -0400 (0:00:00.054) 0:00:02.687 ****** 32980 1727096590.76077: entering _queue_task() for managed_node2/gather_facts 32980 1727096590.76490: worker is 1 (out of 1 available) 32980 1727096590.76498: exiting _queue_task() for managed_node2/gather_facts 32980 1727096590.76508: done queuing things up, now waiting for results queue to drain 32980 1727096590.76509: waiting for pending results... 
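For context before the trace continues: the "Gathering Facts" task being queued here is the implicit fact-gathering step that runs at the start of the play "Play for testing vlan mtu setting" from tests_vlan_mtu.yml. The sketch below is a minimal, illustrative play that would trigger the same step; it is not the actual test playbook. The host name managed_node2 and the fact names come from this run, while the play body and the debug task are assumptions added only to show how the gathered facts (e.g. ansible_distribution_major_version, checked by the conditional later in the trace) become available to subsequent tasks.

- name: Play for testing vlan mtu setting   # illustrative sketch, not the real tests_vlan_mtu.yml
  hosts: managed_node2
  gather_facts: true                        # produces the TASK [Gathering Facts] run traced below
  tasks:
    - name: Show a gathered fact            # hypothetical follow-up task for illustration
      ansible.builtin.debug:
        msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"

With gather_facts enabled, the strategy queues the setup module over the SSH connection plugin exactly as logged in the remainder of this section.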
32980 1727096590.76572: running TaskExecutor() for managed_node2/TASK: Gathering Facts 32980 1727096590.76737: in run() - task 0afff68d-5257-457d-ef33-00000000010b 32980 1727096590.76741: variable 'ansible_search_path' from source: unknown 32980 1727096590.76744: calling self._execute() 32980 1727096590.76806: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.76817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.76830: variable 'omit' from source: magic vars 32980 1727096590.77244: variable 'ansible_distribution_major_version' from source: facts 32980 1727096590.77259: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096590.77270: variable 'omit' from source: magic vars 32980 1727096590.77305: variable 'omit' from source: magic vars 32980 1727096590.77394: variable 'omit' from source: magic vars 32980 1727096590.77429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096590.77498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096590.77541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096590.77612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.77618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096590.77654: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096590.77675: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.77725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.77849: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096590.77861: Set connection var ansible_timeout to 10 32980 1727096590.77869: Set connection var ansible_shell_type to sh 32980 1727096590.77879: Set connection var ansible_connection to ssh 32980 1727096590.77958: Set connection var ansible_shell_executable to /bin/sh 32980 1727096590.77962: Set connection var ansible_pipelining to False 32980 1727096590.77964: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.77966: variable 'ansible_connection' from source: unknown 32980 1727096590.77975: variable 'ansible_module_compression' from source: unknown 32980 1727096590.77980: variable 'ansible_shell_type' from source: unknown 32980 1727096590.77982: variable 'ansible_shell_executable' from source: unknown 32980 1727096590.77987: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096590.77991: variable 'ansible_pipelining' from source: unknown 32980 1727096590.77994: variable 'ansible_timeout' from source: unknown 32980 1727096590.77996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096590.78209: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096590.78229: variable 'omit' from source: magic vars 32980 1727096590.78288: starting attempt loop 32980 1727096590.78292: running the 
handler 32980 1727096590.78294: variable 'ansible_facts' from source: unknown 32980 1727096590.78309: _low_level_execute_command(): starting 32980 1727096590.78321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096590.79164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.79213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.79332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.81003: stdout chunk (state=3): >>>/root <<< 32980 1727096590.81138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.81172: stdout chunk (state=3): >>><<< 32980 1727096590.81197: stderr chunk (state=3): >>><<< 32980 1727096590.81277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.81281: _low_level_execute_command(): starting 32980 1727096590.81286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568 `" && echo ansible-tmp-1727096590.812238-33133-248709685111568="` echo /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568 `" ) && sleep 0' 32980 1727096590.81889: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096590.81904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.81927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096590.81948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096590.81969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096590.82041: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.82122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.82148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.82173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.82278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.84202: stdout chunk (state=3): >>>ansible-tmp-1727096590.812238-33133-248709685111568=/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568 <<< 32980 1727096590.84313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.84476: stderr chunk (state=3): >>><<< 32980 1727096590.84480: stdout chunk (state=3): >>><<< 32980 1727096590.84483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096590.812238-33133-248709685111568=/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.84490: variable 'ansible_module_compression' from source: unknown 32980 1727096590.84775: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32980 1727096590.84781: variable 'ansible_facts' from source: unknown 32980 1727096590.85176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py 32980 1727096590.85590: Sending initial data 32980 1727096590.85623: Sent initial data (153 bytes) 32980 1727096590.87198: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.87211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.87254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.87281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.87362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.87560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.89184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096590.89233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096590.89288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpyt9a62hp /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py <<< 32980 1727096590.89312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py" <<< 32980 1727096590.89338: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpyt9a62hp" to remote "/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py" <<< 32980 1727096590.91004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.91058: stdout chunk (state=3): >>><<< 32980 1727096590.91086: stderr chunk (state=3): >>><<< 32980 1727096590.91124: done transferring module to remote 32980 1727096590.91140: _low_level_execute_command(): starting 32980 1727096590.91143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/ /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py && sleep 0' 32980 1727096590.91878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096590.91885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.92001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.92019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.92056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.92097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096590.94140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096590.94147: stdout chunk (state=3): >>><<< 32980 1727096590.94150: stderr chunk (state=3): >>><<< 32980 1727096590.94153: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096590.94155: _low_level_execute_command(): starting 32980 1727096590.94157: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/AnsiballZ_setup.py && sleep 0' 32980 1727096590.95141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096590.95156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096590.95176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096590.95194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096590.95212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096590.95224: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096590.95252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096590.95351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096590.95369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096590.95390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096590.95462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096591.59279: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": 
"CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "11", "epoch": "1727096591", "epoch_int": "1727096591", "date": "2024-09-23", "time": "09:03:11", "iso8601_micro": "2024-09-23T13:03:11.235677Z", "iso8601": "2024-09-23T13:03:11Z", "iso8601_basic": "20240923T090311235677", "iso8601_basic_short": "20240923T090311", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_isc<<< 32980 1727096591.59362: stdout chunk (state=3): >>>si_iqn": "", "ansible_loadavg": {"1m": 0.740234375, "5m": 0.59033203125, 
"15m": 0.35302734375}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 733, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790765056, "block_size": 4096, "block_total": 65519099, "block_available": 63913761, "block_used": 1605338, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": 
"0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": 
"on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32980 1727096591.61392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096591.61463: stderr chunk (state=3): >>><<< 32980 1727096591.61478: stdout chunk (state=3): >>><<< 32980 1727096591.61483: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "03", "second": "11", "epoch": "1727096591", "epoch_int": "1727096591", "date": "2024-09-23", "time": "09:03:11", "iso8601_micro": "2024-09-23T13:03:11.235677Z", "iso8601": "2024-09-23T13:03:11Z", "iso8601_basic": "20240923T090311235677", "iso8601_basic_short": "20240923T090311", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.740234375, "5m": 0.59033203125, "15m": 0.35302734375}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 733, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", 
"fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790765056, "block_size": 4096, "block_total": 65519099, "block_available": 63913761, "block_used": 1605338, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096591.61764: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096591.61797: _low_level_execute_command(): starting 32980 1727096591.61864: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096590.812238-33133-248709685111568/ > /dev/null 2>&1 && sleep 0' 32980 1727096591.62555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096591.62558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096591.62560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096591.62562: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096591.62565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096591.62639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096591.62642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096591.62644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096591.62692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096591.64590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096591.64611: stderr chunk (state=3): >>><<< 32980 1727096591.64626: stdout chunk (state=3): >>><<< 32980 1727096591.64645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096591.64659: handler run complete 32980 1727096591.64807: variable 'ansible_facts' from source: unknown 32980 1727096591.64940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.65278: variable 'ansible_facts' from source: unknown 32980 1727096591.65377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.65516: attempt loop complete, returning result 32980 1727096591.65531: _execute() done 32980 1727096591.65537: dumping result to json 32980 1727096591.65572: done dumping result, returning 32980 1727096591.65586: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-457d-ef33-00000000010b] 32980 1727096591.65633: sending task result for task 0afff68d-5257-457d-ef33-00000000010b ok: [managed_node2] 32980 1727096591.66551: no more pending results, returning what we have 32980 1727096591.66554: results queue empty 32980 1727096591.66554: checking for any_errors_fatal 32980 1727096591.66556: done checking for any_errors_fatal 32980 1727096591.66556: checking for max_fail_percentage 32980 1727096591.66558: done checking for max_fail_percentage 32980 1727096591.66559: checking to see if all hosts have failed and the running result is not ok 32980 1727096591.66559: done checking to see if all hosts have failed 32980 1727096591.66560: getting the remaining hosts for this loop 32980 1727096591.66561: done getting the remaining hosts for this loop 32980 1727096591.66565: getting the next task for host 
managed_node2 32980 1727096591.66580: done getting next task for host managed_node2 32980 1727096591.66582: ^ task is: TASK: meta (flush_handlers) 32980 1727096591.66584: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096591.66593: getting variables 32980 1727096591.66595: in VariableManager get_vars() 32980 1727096591.66626: Calling all_inventory to load vars for managed_node2 32980 1727096591.66628: Calling groups_inventory to load vars for managed_node2 32980 1727096591.66630: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.66636: done sending task result for task 0afff68d-5257-457d-ef33-00000000010b 32980 1727096591.66638: WORKER PROCESS EXITING 32980 1727096591.66647: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.66650: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.66653: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.67019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.67227: done with get_vars() 32980 1727096591.67240: done getting variables 32980 1727096591.67316: in VariableManager get_vars() 32980 1727096591.67331: Calling all_inventory to load vars for managed_node2 32980 1727096591.67333: Calling groups_inventory to load vars for managed_node2 32980 1727096591.67335: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.67340: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.67342: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.67344: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.67509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.67704: done with get_vars() 32980 1727096591.67717: done queuing things up, now waiting for results queue to drain 32980 1727096591.67723: results queue empty 32980 1727096591.67724: checking for any_errors_fatal 32980 1727096591.67727: done checking for any_errors_fatal 32980 1727096591.67731: checking for max_fail_percentage 32980 1727096591.67733: done checking for max_fail_percentage 32980 1727096591.67733: checking to see if all hosts have failed and the running result is not ok 32980 1727096591.67734: done checking to see if all hosts have failed 32980 1727096591.67735: getting the remaining hosts for this loop 32980 1727096591.67736: done getting the remaining hosts for this loop 32980 1727096591.67738: getting the next task for host managed_node2 32980 1727096591.67742: done getting next task for host managed_node2 32980 1727096591.67744: ^ task is: TASK: Include the task 'show_interfaces.yml' 32980 1727096591.67746: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096591.67748: getting variables 32980 1727096591.67749: in VariableManager get_vars() 32980 1727096591.67761: Calling all_inventory to load vars for managed_node2 32980 1727096591.67763: Calling groups_inventory to load vars for managed_node2 32980 1727096591.67765: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.67775: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.67778: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.67781: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.67916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.68092: done with get_vars() 32980 1727096591.68100: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Monday 23 September 2024 09:03:11 -0400 (0:00:00.920) 0:00:03.608 ****** 32980 1727096591.68166: entering _queue_task() for managed_node2/include_tasks 32980 1727096591.68610: worker is 1 (out of 1 available) 32980 1727096591.68622: exiting _queue_task() for managed_node2/include_tasks 32980 1727096591.68633: done queuing things up, now waiting for results queue to drain 32980 1727096591.68635: waiting for pending results... 32980 1727096591.68817: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 32980 1727096591.68977: in run() - task 0afff68d-5257-457d-ef33-00000000000b 32980 1727096591.68980: variable 'ansible_search_path' from source: unknown 32980 1727096591.68995: calling self._execute() 32980 1727096591.69091: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096591.69107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096591.69122: variable 'omit' from source: magic vars 32980 1727096591.69582: variable 'ansible_distribution_major_version' from source: facts 32980 1727096591.69586: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096591.69589: _execute() done 32980 1727096591.69591: dumping result to json 32980 1727096591.69594: done dumping result, returning 32980 1727096591.69596: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-457d-ef33-00000000000b] 32980 1727096591.69605: sending task result for task 0afff68d-5257-457d-ef33-00000000000b 32980 1727096591.69838: no more pending results, returning what we have 32980 1727096591.69843: in VariableManager get_vars() 32980 1727096591.69897: Calling all_inventory to load vars for managed_node2 32980 1727096591.69902: Calling groups_inventory to load vars for managed_node2 32980 1727096591.69905: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.69919: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.69923: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.69927: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.70299: done sending task result for task 0afff68d-5257-457d-ef33-00000000000b 32980 1727096591.70303: WORKER PROCESS EXITING 32980 1727096591.70326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.70569: done with get_vars() 32980 1727096591.70579: 
variable 'ansible_search_path' from source: unknown 32980 1727096591.70593: we have included files to process 32980 1727096591.70594: generating all_blocks data 32980 1727096591.70595: done generating all_blocks data 32980 1727096591.70596: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096591.70597: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096591.70600: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096591.70763: in VariableManager get_vars() 32980 1727096591.70790: done with get_vars() 32980 1727096591.70906: done processing included file 32980 1727096591.70908: iterating over new_blocks loaded from include file 32980 1727096591.70909: in VariableManager get_vars() 32980 1727096591.70927: done with get_vars() 32980 1727096591.70928: filtering new block on tags 32980 1727096591.70946: done filtering new block on tags 32980 1727096591.70948: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 32980 1727096591.70959: extending task lists for all hosts with included blocks 32980 1727096591.73039: done extending task lists 32980 1727096591.73041: done processing included files 32980 1727096591.73042: results queue empty 32980 1727096591.73043: checking for any_errors_fatal 32980 1727096591.73044: done checking for any_errors_fatal 32980 1727096591.73045: checking for max_fail_percentage 32980 1727096591.73046: done checking for max_fail_percentage 32980 1727096591.73047: checking to see if all hosts have failed and the running result is not ok 32980 1727096591.73047: done checking to see if all hosts have failed 32980 1727096591.73049: getting the remaining hosts for this loop 32980 1727096591.73050: done getting the remaining hosts for this loop 32980 1727096591.73053: getting the next task for host managed_node2 32980 1727096591.73057: done getting next task for host managed_node2 32980 1727096591.73059: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32980 1727096591.73061: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096591.73064: getting variables 32980 1727096591.73065: in VariableManager get_vars() 32980 1727096591.73086: Calling all_inventory to load vars for managed_node2 32980 1727096591.73089: Calling groups_inventory to load vars for managed_node2 32980 1727096591.73091: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.73097: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.73099: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.73102: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.73266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.73477: done with get_vars() 32980 1727096591.73488: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:03:11 -0400 (0:00:00.053) 0:00:03.662 ****** 32980 1727096591.73561: entering _queue_task() for managed_node2/include_tasks 32980 1727096591.73976: worker is 1 (out of 1 available) 32980 1727096591.73987: exiting _queue_task() for managed_node2/include_tasks 32980 1727096591.73998: done queuing things up, now waiting for results queue to drain 32980 1727096591.73999: waiting for pending results... 32980 1727096591.74192: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 32980 1727096591.74330: in run() - task 0afff68d-5257-457d-ef33-000000000120 32980 1727096591.74339: variable 'ansible_search_path' from source: unknown 32980 1727096591.74342: variable 'ansible_search_path' from source: unknown 32980 1727096591.74383: calling self._execute() 32980 1727096591.74551: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096591.74555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096591.74558: variable 'omit' from source: magic vars 32980 1727096591.74933: variable 'ansible_distribution_major_version' from source: facts 32980 1727096591.74951: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096591.74962: _execute() done 32980 1727096591.74972: dumping result to json 32980 1727096591.74989: done dumping result, returning 32980 1727096591.75004: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-457d-ef33-000000000120] 32980 1727096591.75014: sending task result for task 0afff68d-5257-457d-ef33-000000000120 32980 1727096591.75142: no more pending results, returning what we have 32980 1727096591.75147: in VariableManager get_vars() 32980 1727096591.75200: Calling all_inventory to load vars for managed_node2 32980 1727096591.75204: Calling groups_inventory to load vars for managed_node2 32980 1727096591.75206: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.75220: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.75223: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.75227: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.75716: done sending task result for task 0afff68d-5257-457d-ef33-000000000120 32980 1727096591.75719: WORKER PROCESS EXITING 32980 1727096591.75742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 32980 1727096591.75948: done with get_vars() 32980 1727096591.75956: variable 'ansible_search_path' from source: unknown 32980 1727096591.75957: variable 'ansible_search_path' from source: unknown 32980 1727096591.75999: we have included files to process 32980 1727096591.76000: generating all_blocks data 32980 1727096591.76001: done generating all_blocks data 32980 1727096591.76003: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096591.76004: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096591.76006: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096591.76339: done processing included file 32980 1727096591.76341: iterating over new_blocks loaded from include file 32980 1727096591.76343: in VariableManager get_vars() 32980 1727096591.76366: done with get_vars() 32980 1727096591.76371: filtering new block on tags 32980 1727096591.76391: done filtering new block on tags 32980 1727096591.76393: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 32980 1727096591.76398: extending task lists for all hosts with included blocks 32980 1727096591.76505: done extending task lists 32980 1727096591.76507: done processing included files 32980 1727096591.76507: results queue empty 32980 1727096591.76508: checking for any_errors_fatal 32980 1727096591.76511: done checking for any_errors_fatal 32980 1727096591.76512: checking for max_fail_percentage 32980 1727096591.76513: done checking for max_fail_percentage 32980 1727096591.76514: checking to see if all hosts have failed and the running result is not ok 32980 1727096591.76514: done checking to see if all hosts have failed 32980 1727096591.76515: getting the remaining hosts for this loop 32980 1727096591.76516: done getting the remaining hosts for this loop 32980 1727096591.76519: getting the next task for host managed_node2 32980 1727096591.76523: done getting next task for host managed_node2 32980 1727096591.76525: ^ task is: TASK: Gather current interface info 32980 1727096591.76528: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096591.76530: getting variables 32980 1727096591.76531: in VariableManager get_vars() 32980 1727096591.76544: Calling all_inventory to load vars for managed_node2 32980 1727096591.76546: Calling groups_inventory to load vars for managed_node2 32980 1727096591.76548: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096591.76553: Calling all_plugins_play to load vars for managed_node2 32980 1727096591.76556: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096591.76558: Calling groups_plugins_play to load vars for managed_node2 32980 1727096591.76709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096591.76932: done with get_vars() 32980 1727096591.76941: done getting variables 32980 1727096591.76987: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:03:11 -0400 (0:00:00.034) 0:00:03.696 ****** 32980 1727096591.77022: entering _queue_task() for managed_node2/command 32980 1727096591.77326: worker is 1 (out of 1 available) 32980 1727096591.77452: exiting _queue_task() for managed_node2/command 32980 1727096591.77461: done queuing things up, now waiting for results queue to drain 32980 1727096591.77463: waiting for pending results... 
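[Editor's note] The records above trace an include chain: tests_vlan_mtu.yml:10 pulls in show_interfaces.yml, whose first task (show_interfaces.yml:3) in turn includes get_current_interfaces.yml. As a reading aid only, those two include tasks presumably look roughly like the sketch below; the task names and file names are taken from the log, while the relative paths and exact YAML layout are assumptions, not the collection's actual source.

    # tests_vlan_mtu.yml (hypothetical reconstruction from the log)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: tasks/show_interfaces.yml

    # tasks/show_interfaces.yml, line 3 (hypothetical reconstruction from the log)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml
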
32980 1727096591.77617: running TaskExecutor() for managed_node2/TASK: Gather current interface info 32980 1727096591.77735: in run() - task 0afff68d-5257-457d-ef33-0000000001ff 32980 1727096591.77754: variable 'ansible_search_path' from source: unknown 32980 1727096591.77762: variable 'ansible_search_path' from source: unknown 32980 1727096591.77813: calling self._execute() 32980 1727096591.77911: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096591.77975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096591.77980: variable 'omit' from source: magic vars 32980 1727096591.78343: variable 'ansible_distribution_major_version' from source: facts 32980 1727096591.78362: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096591.78379: variable 'omit' from source: magic vars 32980 1727096591.78430: variable 'omit' from source: magic vars 32980 1727096591.78477: variable 'omit' from source: magic vars 32980 1727096591.78524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096591.78645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096591.78649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096591.78654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096591.78656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096591.78682: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096591.78691: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096591.78699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096591.78815: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096591.78827: Set connection var ansible_timeout to 10 32980 1727096591.78834: Set connection var ansible_shell_type to sh 32980 1727096591.78840: Set connection var ansible_connection to ssh 32980 1727096591.78852: Set connection var ansible_shell_executable to /bin/sh 32980 1727096591.78871: Set connection var ansible_pipelining to False 32980 1727096591.78976: variable 'ansible_shell_executable' from source: unknown 32980 1727096591.78980: variable 'ansible_connection' from source: unknown 32980 1727096591.78983: variable 'ansible_module_compression' from source: unknown 32980 1727096591.78985: variable 'ansible_shell_type' from source: unknown 32980 1727096591.78987: variable 'ansible_shell_executable' from source: unknown 32980 1727096591.78989: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096591.78991: variable 'ansible_pipelining' from source: unknown 32980 1727096591.78993: variable 'ansible_timeout' from source: unknown 32980 1727096591.78995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096591.79100: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096591.79113: variable 'omit' from source: magic vars 32980 
1727096591.79116: starting attempt loop 32980 1727096591.79119: running the handler 32980 1727096591.79132: _low_level_execute_command(): starting 32980 1727096591.79139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096591.79976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096591.79999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096591.80019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096591.80041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096591.80119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096591.81841: stdout chunk (state=3): >>>/root <<< 32980 1727096591.82000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096591.82004: stdout chunk (state=3): >>><<< 32980 1727096591.82007: stderr chunk (state=3): >>><<< 32980 1727096591.82026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096591.82046: _low_level_execute_command(): starting 32980 1727096591.82058: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135 `" && echo ansible-tmp-1727096591.8203344-33189-188831131090135="` echo 
/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135 `" ) && sleep 0' 32980 1727096591.82838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096591.82898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096591.82943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096591.84938: stdout chunk (state=3): >>>ansible-tmp-1727096591.8203344-33189-188831131090135=/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135 <<< 32980 1727096591.85091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096591.85095: stdout chunk (state=3): >>><<< 32980 1727096591.85098: stderr chunk (state=3): >>><<< 32980 1727096591.85115: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096591.8203344-33189-188831131090135=/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096591.85152: variable 'ansible_module_compression' from source: unknown 32980 1727096591.85279: ANSIBALLZ: Using generic lock for ansible.legacy.command 32980 1727096591.85282: ANSIBALLZ: Acquiring lock 32980 1727096591.85285: ANSIBALLZ: Lock acquired: 140258569802416 32980 1727096591.85287: ANSIBALLZ: Creating module 32980 1727096592.00702: ANSIBALLZ: Writing module into payload 32980 
1727096592.00783: ANSIBALLZ: Writing module 32980 1727096592.00786: ANSIBALLZ: Renaming module 32980 1727096592.00788: ANSIBALLZ: Done creating module 32980 1727096592.00790: variable 'ansible_facts' from source: unknown 32980 1727096592.00881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py 32980 1727096592.01064: Sending initial data 32980 1727096592.01069: Sent initial data (156 bytes) 32980 1727096592.02239: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096592.02412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.02455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.02621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.04312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 32980 1727096592.04321: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096592.04394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096592.04398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpjwehgblo /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py <<< 32980 1727096592.04401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py" <<< 32980 1727096592.04433: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpjwehgblo" to remote "/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py" <<< 32980 1727096592.05180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.05187: stdout chunk (state=3): >>><<< 32980 1727096592.05292: stderr chunk (state=3): >>><<< 32980 1727096592.05295: done transferring module to remote 32980 1727096592.05298: _low_level_execute_command(): starting 32980 1727096592.05300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/ /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py && sleep 0' 32980 1727096592.05973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096592.05988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.06005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096592.06066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.06125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.06149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.06172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.06329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.08194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.08211: stderr chunk (state=3): >>><<< 32980 1727096592.08220: stdout chunk (state=3): >>><<< 32980 1727096592.08244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.08252: _low_level_execute_command(): starting 32980 1727096592.08263: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/AnsiballZ_command.py && sleep 0' 32980 1727096592.08863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096592.08884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.08902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096592.08926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096592.08945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096592.08958: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096592.08984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.09033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.09092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.09117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.09151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.09266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.25264: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:12.248123", "end": "2024-09-23 09:03:12.251569", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096592.26907: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096592.26924: stdout chunk (state=3): >>><<< 32980 1727096592.26943: stderr chunk (state=3): >>><<< 32980 1727096592.26973: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:12.248123", "end": "2024-09-23 09:03:12.251569", "delta": "0:00:00.003446", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
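[Editor's note] The module result above (cmd ["ls", "-1"] with chdir /sys/class/net, stdout "bonding_masters\neth0\nlo") implies a task shaped roughly like the following. This is a hedged reconstruction from the logged module_args, not the file's verbatim contents; in particular the register name is inferred from the later reference to '_current_interfaces'.

    # tasks/get_current_interfaces.yml, line 3 (sketch inferred from the logged invocation)
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net        # list network interfaces via sysfs
      register: _current_interfaces  # assumed register name
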
32980 1727096592.27094: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096592.27098: _low_level_execute_command(): starting 32980 1727096592.27101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096591.8203344-33189-188831131090135/ > /dev/null 2>&1 && sleep 0' 32980 1727096592.28730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096592.28743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.28787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.28804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096592.28909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.29079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.29110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.29123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.29202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.31394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.31398: stdout chunk (state=3): >>><<< 32980 1727096592.31400: stderr chunk (state=3): >>><<< 32980 1727096592.31403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.31405: handler run complete 32980 1727096592.31407: Evaluated conditional (False): False 32980 1727096592.31409: attempt loop complete, returning result 32980 1727096592.31411: _execute() done 32980 1727096592.31413: dumping result to json 32980 1727096592.31415: done dumping result, returning 32980 1727096592.31417: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-457d-ef33-0000000001ff] 32980 1727096592.31419: sending task result for task 0afff68d-5257-457d-ef33-0000000001ff 32980 1727096592.31507: done sending task result for task 0afff68d-5257-457d-ef33-0000000001ff 32980 1727096592.31510: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003446", "end": "2024-09-23 09:03:12.251569", "rc": 0, "start": "2024-09-23 09:03:12.248123" } STDOUT: bonding_masters eth0 lo 32980 1727096592.31601: no more pending results, returning what we have 32980 1727096592.31606: results queue empty 32980 1727096592.31607: checking for any_errors_fatal 32980 1727096592.31609: done checking for any_errors_fatal 32980 1727096592.31610: checking for max_fail_percentage 32980 1727096592.31611: done checking for max_fail_percentage 32980 1727096592.31612: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.31613: done checking to see if all hosts have failed 32980 1727096592.31613: getting the remaining hosts for this loop 32980 1727096592.31615: done getting the remaining hosts for this loop 32980 1727096592.31618: getting the next task for host managed_node2 32980 1727096592.31653: done getting next task for host managed_node2 32980 1727096592.31656: ^ task is: TASK: Set current_interfaces 32980 1727096592.31660: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.31664: getting variables 32980 1727096592.31666: in VariableManager get_vars() 32980 1727096592.31713: Calling all_inventory to load vars for managed_node2 32980 1727096592.31716: Calling groups_inventory to load vars for managed_node2 32980 1727096592.31719: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.32179: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.32184: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.32188: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.32752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.33216: done with get_vars() 32980 1727096592.33314: done getting variables 32980 1727096592.33557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:03:12 -0400 (0:00:00.565) 0:00:04.262 ****** 32980 1727096592.33591: entering _queue_task() for managed_node2/set_fact 32980 1727096592.34463: worker is 1 (out of 1 available) 32980 1727096592.34477: exiting _queue_task() for managed_node2/set_fact 32980 1727096592.34489: done queuing things up, now waiting for results queue to drain 32980 1727096592.34565: waiting for pending results... 
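[Editor's note] The 'Set current_interfaces' task queued here is an action plugin that runs on the controller (no SSH round trip appears before its result), and its output below sets current_interfaces to ['bonding_masters', 'eth0', 'lo'], i.e. the lines of the previous command's stdout. A plausible sketch, assuming the fact is derived from the registered result's stdout_lines:

    # tasks/get_current_interfaces.yml, line 9 (assumed implementation)
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
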
32980 1727096592.35337: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 32980 1727096592.35343: in run() - task 0afff68d-5257-457d-ef33-000000000200 32980 1727096592.35346: variable 'ansible_search_path' from source: unknown 32980 1727096592.35349: variable 'ansible_search_path' from source: unknown 32980 1727096592.35351: calling self._execute() 32980 1727096592.35538: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.35542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.35545: variable 'omit' from source: magic vars 32980 1727096592.36283: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.36286: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.36289: variable 'omit' from source: magic vars 32980 1727096592.36292: variable 'omit' from source: magic vars 32980 1727096592.36398: variable '_current_interfaces' from source: set_fact 32980 1727096592.36593: variable 'omit' from source: magic vars 32980 1727096592.36652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096592.36714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096592.36745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096592.36771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.36953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.36957: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096592.36959: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.36962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.37063: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096592.37081: Set connection var ansible_timeout to 10 32980 1727096592.37088: Set connection var ansible_shell_type to sh 32980 1727096592.37094: Set connection var ansible_connection to ssh 32980 1727096592.37105: Set connection var ansible_shell_executable to /bin/sh 32980 1727096592.37117: Set connection var ansible_pipelining to False 32980 1727096592.37160: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.37171: variable 'ansible_connection' from source: unknown 32980 1727096592.37182: variable 'ansible_module_compression' from source: unknown 32980 1727096592.37188: variable 'ansible_shell_type' from source: unknown 32980 1727096592.37299: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.37302: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.37304: variable 'ansible_pipelining' from source: unknown 32980 1727096592.37306: variable 'ansible_timeout' from source: unknown 32980 1727096592.37308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.37519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32980 1727096592.37537: variable 'omit' from source: magic vars 32980 1727096592.37547: starting attempt loop 32980 1727096592.37553: running the handler 32980 1727096592.37571: handler run complete 32980 1727096592.37862: attempt loop complete, returning result 32980 1727096592.37869: _execute() done 32980 1727096592.37874: dumping result to json 32980 1727096592.37877: done dumping result, returning 32980 1727096592.37881: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-457d-ef33-000000000200] 32980 1727096592.37885: sending task result for task 0afff68d-5257-457d-ef33-000000000200 32980 1727096592.38275: done sending task result for task 0afff68d-5257-457d-ef33-000000000200 32980 1727096592.38282: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32980 1727096592.38351: no more pending results, returning what we have 32980 1727096592.38415: results queue empty 32980 1727096592.38417: checking for any_errors_fatal 32980 1727096592.38424: done checking for any_errors_fatal 32980 1727096592.38424: checking for max_fail_percentage 32980 1727096592.38426: done checking for max_fail_percentage 32980 1727096592.38426: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.38427: done checking to see if all hosts have failed 32980 1727096592.38427: getting the remaining hosts for this loop 32980 1727096592.38428: done getting the remaining hosts for this loop 32980 1727096592.38431: getting the next task for host managed_node2 32980 1727096592.38439: done getting next task for host managed_node2 32980 1727096592.38441: ^ task is: TASK: Show current_interfaces 32980 1727096592.38448: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.38452: getting variables 32980 1727096592.38454: in VariableManager get_vars() 32980 1727096592.38509: Calling all_inventory to load vars for managed_node2 32980 1727096592.38514: Calling groups_inventory to load vars for managed_node2 32980 1727096592.38517: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.38533: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.38539: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.38543: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.38696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.38811: done with get_vars() 32980 1727096592.38818: done getting variables 32980 1727096592.38908: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:03:12 -0400 (0:00:00.053) 0:00:04.316 ****** 32980 1727096592.38938: entering _queue_task() for managed_node2/debug 32980 1727096592.38940: Creating lock for debug 32980 1727096592.39212: worker is 1 (out of 1 available) 32980 1727096592.39222: exiting _queue_task() for managed_node2/debug 32980 1727096592.39232: done queuing things up, now waiting for results queue to drain 32980 1727096592.39234: waiting for pending results... 
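[Editor's note] The debug action loaded above produces the "current_interfaces: [...]" message shown in the next records. A minimal sketch of such a task, with the exact message template assumed rather than copied from the source:

    # tasks/show_interfaces.yml, line 5 (assumed implementation)
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
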
32980 1727096592.39479: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 32980 1727096592.39578: in run() - task 0afff68d-5257-457d-ef33-000000000121 32980 1727096592.39599: variable 'ansible_search_path' from source: unknown 32980 1727096592.39606: variable 'ansible_search_path' from source: unknown 32980 1727096592.39644: calling self._execute() 32980 1727096592.39774: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.39778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.39781: variable 'omit' from source: magic vars 32980 1727096592.40324: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.40342: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.40354: variable 'omit' from source: magic vars 32980 1727096592.40574: variable 'omit' from source: magic vars 32980 1727096592.40655: variable 'current_interfaces' from source: set_fact 32980 1727096592.40698: variable 'omit' from source: magic vars 32980 1727096592.40748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096592.40792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096592.40828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096592.40848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.40861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.40908: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096592.40917: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.40937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.41052: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096592.41127: Set connection var ansible_timeout to 10 32980 1727096592.41130: Set connection var ansible_shell_type to sh 32980 1727096592.41132: Set connection var ansible_connection to ssh 32980 1727096592.41133: Set connection var ansible_shell_executable to /bin/sh 32980 1727096592.41136: Set connection var ansible_pipelining to False 32980 1727096592.41137: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.41139: variable 'ansible_connection' from source: unknown 32980 1727096592.41141: variable 'ansible_module_compression' from source: unknown 32980 1727096592.41142: variable 'ansible_shell_type' from source: unknown 32980 1727096592.41144: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.41146: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.41148: variable 'ansible_pipelining' from source: unknown 32980 1727096592.41159: variable 'ansible_timeout' from source: unknown 32980 1727096592.41170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.41323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
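The "Set connection var" entries above show the connection settings the executor applies for this task; the matching ansible_* variables are all reported as "from source: unknown", meaning nothing in the inventory overrides the defaults. Spelled out as explicit host variables, the effective settings would look roughly like this sketch (equivalent values taken from the log, not from the test's inventory):

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED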
32980 1727096592.41340: variable 'omit' from source: magic vars 32980 1727096592.41351: starting attempt loop 32980 1727096592.41357: running the handler 32980 1727096592.41490: handler run complete 32980 1727096592.41493: attempt loop complete, returning result 32980 1727096592.41495: _execute() done 32980 1727096592.41497: dumping result to json 32980 1727096592.41499: done dumping result, returning 32980 1727096592.41501: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-457d-ef33-000000000121] 32980 1727096592.41503: sending task result for task 0afff68d-5257-457d-ef33-000000000121 32980 1727096592.41569: done sending task result for task 0afff68d-5257-457d-ef33-000000000121 32980 1727096592.41572: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32980 1727096592.41643: no more pending results, returning what we have 32980 1727096592.41646: results queue empty 32980 1727096592.41648: checking for any_errors_fatal 32980 1727096592.41652: done checking for any_errors_fatal 32980 1727096592.41652: checking for max_fail_percentage 32980 1727096592.41654: done checking for max_fail_percentage 32980 1727096592.41654: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.41655: done checking to see if all hosts have failed 32980 1727096592.41656: getting the remaining hosts for this loop 32980 1727096592.41658: done getting the remaining hosts for this loop 32980 1727096592.41661: getting the next task for host managed_node2 32980 1727096592.41671: done getting next task for host managed_node2 32980 1727096592.41675: ^ task is: TASK: Include the task 'manage_test_interface.yml' 32980 1727096592.41677: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096592.41680: getting variables 32980 1727096592.41682: in VariableManager get_vars() 32980 1727096592.41724: Calling all_inventory to load vars for managed_node2 32980 1727096592.41727: Calling groups_inventory to load vars for managed_node2 32980 1727096592.41730: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.41741: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.41744: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.41748: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.42041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.42205: done with get_vars() 32980 1727096592.42213: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Monday 23 September 2024 09:03:12 -0400 (0:00:00.033) 0:00:04.349 ****** 32980 1727096592.42281: entering _queue_task() for managed_node2/include_tasks 32980 1727096592.42486: worker is 1 (out of 1 available) 32980 1727096592.42613: exiting _queue_task() for managed_node2/include_tasks 32980 1727096592.42624: done queuing things up, now waiting for results queue to drain 32980 1727096592.42626: waiting for pending results... 
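The include task queued above (tests_vlan_mtu.yml:12) pulls in manage_test_interface.yml, as the loading messages in the next entries confirm; the task is roughly an include_tasks of this shape (the exact relative path written in the playbook is an assumption, only the resolved path is logged):

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml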
32980 1727096592.42885: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 32980 1727096592.42891: in run() - task 0afff68d-5257-457d-ef33-00000000000c 32980 1727096592.42894: variable 'ansible_search_path' from source: unknown 32980 1727096592.42897: calling self._execute() 32980 1727096592.43112: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.43129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.43237: variable 'omit' from source: magic vars 32980 1727096592.43695: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.43713: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.43723: _execute() done 32980 1727096592.43730: dumping result to json 32980 1727096592.43738: done dumping result, returning 32980 1727096592.43749: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-457d-ef33-00000000000c] 32980 1727096592.43759: sending task result for task 0afff68d-5257-457d-ef33-00000000000c 32980 1727096592.44006: no more pending results, returning what we have 32980 1727096592.44011: in VariableManager get_vars() 32980 1727096592.44065: Calling all_inventory to load vars for managed_node2 32980 1727096592.44070: Calling groups_inventory to load vars for managed_node2 32980 1727096592.44075: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.44090: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.44093: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.44096: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.44594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.44901: done with get_vars() 32980 1727096592.44909: variable 'ansible_search_path' from source: unknown 32980 1727096592.44963: we have included files to process 32980 1727096592.44968: generating all_blocks data 32980 1727096592.44970: done generating all_blocks data 32980 1727096592.44973: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096592.44975: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096592.44978: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096592.44989: done sending task result for task 0afff68d-5257-457d-ef33-00000000000c 32980 1727096592.44993: WORKER PROCESS EXITING 32980 1727096592.45533: in VariableManager get_vars() 32980 1727096592.45556: done with get_vars() 32980 1727096592.45981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 32980 1727096592.46921: done processing included file 32980 1727096592.46923: iterating over new_blocks loaded from include file 32980 1727096592.46924: in VariableManager get_vars() 32980 1727096592.46943: done with get_vars() 32980 1727096592.46945: filtering new block on tags 32980 1727096592.47040: done filtering new block on tags 32980 1727096592.47043: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 32980 1727096592.47049: extending task lists for all hosts with included blocks 32980 1727096592.49039: done extending task lists 32980 1727096592.49041: done processing included files 32980 1727096592.49042: results queue empty 32980 1727096592.49042: checking for any_errors_fatal 32980 1727096592.49046: done checking for any_errors_fatal 32980 1727096592.49046: checking for max_fail_percentage 32980 1727096592.49048: done checking for max_fail_percentage 32980 1727096592.49049: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.49050: done checking to see if all hosts have failed 32980 1727096592.49051: getting the remaining hosts for this loop 32980 1727096592.49052: done getting the remaining hosts for this loop 32980 1727096592.49054: getting the next task for host managed_node2 32980 1727096592.49058: done getting next task for host managed_node2 32980 1727096592.49060: ^ task is: TASK: Ensure state in ["present", "absent"] 32980 1727096592.49063: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096592.49065: getting variables 32980 1727096592.49066: in VariableManager get_vars() 32980 1727096592.49086: Calling all_inventory to load vars for managed_node2 32980 1727096592.49089: Calling groups_inventory to load vars for managed_node2 32980 1727096592.49091: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.49097: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.49099: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.49101: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.49460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.50066: done with get_vars() 32980 1727096592.50294: done getting variables 32980 1727096592.50364: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:03:12 -0400 (0:00:00.081) 0:00:04.430 ****** 32980 1727096592.50396: entering _queue_task() for managed_node2/fail 32980 1727096592.50398: Creating lock for fail 32980 1727096592.50852: worker is 1 (out of 1 available) 32980 1727096592.50863: exiting _queue_task() for managed_node2/fail 32980 1727096592.50878: done queuing things up, now waiting for results queue to drain 32980 1727096592.50879: waiting for pending results... 
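The 'Ensure state in ["present", "absent"]' task queued above (manage_test_interface.yml:3) is an input guard; the skipping result that follows reports false_condition: state not in ["present", "absent"], so the task is essentially a conditional fail along these lines (the failure message text is an assumption, not from the log):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"   # message assumed; only the condition is logged
  when: state not in ["present", "absent"]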
32980 1727096592.51133: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 32980 1727096592.51254: in run() - task 0afff68d-5257-457d-ef33-00000000021b 32980 1727096592.51281: variable 'ansible_search_path' from source: unknown 32980 1727096592.51295: variable 'ansible_search_path' from source: unknown 32980 1727096592.51324: calling self._execute() 32980 1727096592.51407: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.51411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.51426: variable 'omit' from source: magic vars 32980 1727096592.51711: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.51721: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.51820: variable 'state' from source: include params 32980 1727096592.51823: Evaluated conditional (state not in ["present", "absent"]): False 32980 1727096592.51828: when evaluation is False, skipping this task 32980 1727096592.51831: _execute() done 32980 1727096592.51833: dumping result to json 32980 1727096592.51836: done dumping result, returning 32980 1727096592.51842: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-457d-ef33-00000000021b] 32980 1727096592.51845: sending task result for task 0afff68d-5257-457d-ef33-00000000021b 32980 1727096592.51932: done sending task result for task 0afff68d-5257-457d-ef33-00000000021b 32980 1727096592.51934: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 32980 1727096592.51981: no more pending results, returning what we have 32980 1727096592.51985: results queue empty 32980 1727096592.51986: checking for any_errors_fatal 32980 1727096592.51987: done checking for any_errors_fatal 32980 1727096592.51988: checking for max_fail_percentage 32980 1727096592.51989: done checking for max_fail_percentage 32980 1727096592.51990: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.51991: done checking to see if all hosts have failed 32980 1727096592.51991: getting the remaining hosts for this loop 32980 1727096592.51993: done getting the remaining hosts for this loop 32980 1727096592.51996: getting the next task for host managed_node2 32980 1727096592.52003: done getting next task for host managed_node2 32980 1727096592.52005: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 32980 1727096592.52008: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.52011: getting variables 32980 1727096592.52013: in VariableManager get_vars() 32980 1727096592.52051: Calling all_inventory to load vars for managed_node2 32980 1727096592.52054: Calling groups_inventory to load vars for managed_node2 32980 1727096592.52056: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.52069: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.52072: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.52075: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.52213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.52334: done with get_vars() 32980 1727096592.52341: done getting variables 32980 1727096592.52388: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:03:12 -0400 (0:00:00.020) 0:00:04.450 ****** 32980 1727096592.52408: entering _queue_task() for managed_node2/fail 32980 1727096592.52600: worker is 1 (out of 1 available) 32980 1727096592.52613: exiting _queue_task() for managed_node2/fail 32980 1727096592.52624: done queuing things up, now waiting for results queue to drain 32980 1727096592.52625: waiting for pending results... 32980 1727096592.52777: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 32980 1727096592.52847: in run() - task 0afff68d-5257-457d-ef33-00000000021c 32980 1727096592.52861: variable 'ansible_search_path' from source: unknown 32980 1727096592.52865: variable 'ansible_search_path' from source: unknown 32980 1727096592.52892: calling self._execute() 32980 1727096592.52949: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.52952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.52971: variable 'omit' from source: magic vars 32980 1727096592.53357: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.53361: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.53488: variable 'type' from source: play vars 32980 1727096592.53501: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 32980 1727096592.53508: when evaluation is False, skipping this task 32980 1727096592.53516: _execute() done 32980 1727096592.53521: dumping result to json 32980 1727096592.53527: done dumping result, returning 32980 1727096592.53535: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-457d-ef33-00000000021c] 32980 1727096592.53543: sending task result for task 0afff68d-5257-457d-ef33-00000000021c skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 32980 1727096592.53684: no more pending results, returning what we have 32980 1727096592.53687: results queue empty 32980 1727096592.53688: checking for any_errors_fatal 32980 
1727096592.53695: done checking for any_errors_fatal 32980 1727096592.53696: checking for max_fail_percentage 32980 1727096592.53698: done checking for max_fail_percentage 32980 1727096592.53698: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.53699: done checking to see if all hosts have failed 32980 1727096592.53700: getting the remaining hosts for this loop 32980 1727096592.53701: done getting the remaining hosts for this loop 32980 1727096592.53705: getting the next task for host managed_node2 32980 1727096592.53712: done getting next task for host managed_node2 32980 1727096592.53715: ^ task is: TASK: Include the task 'show_interfaces.yml' 32980 1727096592.53718: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096592.53722: getting variables 32980 1727096592.53723: in VariableManager get_vars() 32980 1727096592.53763: Calling all_inventory to load vars for managed_node2 32980 1727096592.53766: Calling groups_inventory to load vars for managed_node2 32980 1727096592.53771: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.53786: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.53789: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.53792: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.54213: done sending task result for task 0afff68d-5257-457d-ef33-00000000021c 32980 1727096592.54216: WORKER PROCESS EXITING 32980 1727096592.54226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.54348: done with get_vars() 32980 1727096592.54355: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:03:12 -0400 (0:00:00.020) 0:00:04.470 ****** 32980 1727096592.54421: entering _queue_task() for managed_node2/include_tasks 32980 1727096592.54611: worker is 1 (out of 1 available) 32980 1727096592.54623: exiting _queue_task() for managed_node2/include_tasks 32980 1727096592.54637: done queuing things up, now waiting for results queue to drain 32980 1727096592.54638: waiting for pending results... 
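The companion guard 'Ensure type in ["dummy", "tap", "veth"]' (manage_test_interface.yml:8), skipped just above with false_condition: type not in ["dummy", "tap", "veth"], mirrors the previous sketch (message text again assumed):

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap' or 'veth'"   # message assumed; only the condition is logged
  when: type not in ["dummy", "tap", "veth"]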
32980 1727096592.54784: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 32980 1727096592.54845: in run() - task 0afff68d-5257-457d-ef33-00000000021d 32980 1727096592.54855: variable 'ansible_search_path' from source: unknown 32980 1727096592.54860: variable 'ansible_search_path' from source: unknown 32980 1727096592.54894: calling self._execute() 32980 1727096592.54949: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.54952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.54961: variable 'omit' from source: magic vars 32980 1727096592.55233: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.55243: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.55248: _execute() done 32980 1727096592.55251: dumping result to json 32980 1727096592.55254: done dumping result, returning 32980 1727096592.55261: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-457d-ef33-00000000021d] 32980 1727096592.55263: sending task result for task 0afff68d-5257-457d-ef33-00000000021d 32980 1727096592.55346: done sending task result for task 0afff68d-5257-457d-ef33-00000000021d 32980 1727096592.55349: WORKER PROCESS EXITING 32980 1727096592.55381: no more pending results, returning what we have 32980 1727096592.55385: in VariableManager get_vars() 32980 1727096592.55427: Calling all_inventory to load vars for managed_node2 32980 1727096592.55430: Calling groups_inventory to load vars for managed_node2 32980 1727096592.55432: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.55441: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.55444: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.55447: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.55574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.55688: done with get_vars() 32980 1727096592.55693: variable 'ansible_search_path' from source: unknown 32980 1727096592.55694: variable 'ansible_search_path' from source: unknown 32980 1727096592.55717: we have included files to process 32980 1727096592.55717: generating all_blocks data 32980 1727096592.55719: done generating all_blocks data 32980 1727096592.55721: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096592.55722: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096592.55723: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096592.55789: in VariableManager get_vars() 32980 1727096592.55806: done with get_vars() 32980 1727096592.55878: done processing included file 32980 1727096592.55880: iterating over new_blocks loaded from include file 32980 1727096592.55881: in VariableManager get_vars() 32980 1727096592.55892: done with get_vars() 32980 1727096592.55893: filtering new block on tags 32980 1727096592.55907: done filtering new block on tags 32980 1727096592.55908: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 32980 1727096592.55912: extending task lists for all hosts with included blocks 32980 1727096592.56198: done extending task lists 32980 1727096592.56200: done processing included files 32980 1727096592.56201: results queue empty 32980 1727096592.56202: checking for any_errors_fatal 32980 1727096592.56205: done checking for any_errors_fatal 32980 1727096592.56206: checking for max_fail_percentage 32980 1727096592.56207: done checking for max_fail_percentage 32980 1727096592.56208: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.56211: done checking to see if all hosts have failed 32980 1727096592.56212: getting the remaining hosts for this loop 32980 1727096592.56213: done getting the remaining hosts for this loop 32980 1727096592.56216: getting the next task for host managed_node2 32980 1727096592.56220: done getting next task for host managed_node2 32980 1727096592.56222: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32980 1727096592.56225: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096592.56227: getting variables 32980 1727096592.56228: in VariableManager get_vars() 32980 1727096592.56238: Calling all_inventory to load vars for managed_node2 32980 1727096592.56240: Calling groups_inventory to load vars for managed_node2 32980 1727096592.56242: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.56246: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.56249: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.56251: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.56375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.56556: done with get_vars() 32980 1727096592.56566: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:03:12 -0400 (0:00:00.022) 0:00:04.493 ****** 32980 1727096592.56638: entering _queue_task() for managed_node2/include_tasks 32980 1727096592.56907: worker is 1 (out of 1 available) 32980 1727096592.56919: exiting _queue_task() for managed_node2/include_tasks 32980 1727096592.56933: done queuing things up, now waiting for results queue to drain 32980 1727096592.56935: waiting for pending results... 
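At this point three nested includes are in play. A sketch of the include queued above, with the chain reconstructed from the task paths printed in the banners (the relative path in the include line is an assumption; the chain comments use the paths as logged):

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml
# Include chain so far:
#   tests_vlan_mtu.yml:12         -> tasks/manage_test_interface.yml
#   manage_test_interface.yml:13  -> tasks/show_interfaces.yml
#   show_interfaces.yml:3         -> tasks/get_current_interfaces.yml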
32980 1727096592.57295: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 32980 1727096592.57299: in run() - task 0afff68d-5257-457d-ef33-000000000314 32980 1727096592.57373: variable 'ansible_search_path' from source: unknown 32980 1727096592.57377: variable 'ansible_search_path' from source: unknown 32980 1727096592.57380: calling self._execute() 32980 1727096592.57447: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.57459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.57477: variable 'omit' from source: magic vars 32980 1727096592.57815: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.57825: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.57841: _execute() done 32980 1727096592.57844: dumping result to json 32980 1727096592.57847: done dumping result, returning 32980 1727096592.57852: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-457d-ef33-000000000314] 32980 1727096592.57874: sending task result for task 0afff68d-5257-457d-ef33-000000000314 32980 1727096592.57971: no more pending results, returning what we have 32980 1727096592.57976: in VariableManager get_vars() 32980 1727096592.58019: Calling all_inventory to load vars for managed_node2 32980 1727096592.58021: Calling groups_inventory to load vars for managed_node2 32980 1727096592.58023: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.58036: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.58038: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.58042: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.58214: done sending task result for task 0afff68d-5257-457d-ef33-000000000314 32980 1727096592.58218: WORKER PROCESS EXITING 32980 1727096592.58228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.58345: done with get_vars() 32980 1727096592.58350: variable 'ansible_search_path' from source: unknown 32980 1727096592.58351: variable 'ansible_search_path' from source: unknown 32980 1727096592.58392: we have included files to process 32980 1727096592.58393: generating all_blocks data 32980 1727096592.58394: done generating all_blocks data 32980 1727096592.58395: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096592.58396: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096592.58397: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096592.58560: done processing included file 32980 1727096592.58561: iterating over new_blocks loaded from include file 32980 1727096592.58562: in VariableManager get_vars() 32980 1727096592.58578: done with get_vars() 32980 1727096592.58579: filtering new block on tags 32980 1727096592.58592: done filtering new block on tags 32980 1727096592.58594: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 32980 1727096592.58597: extending task lists for all hosts with included blocks 32980 1727096592.58681: done extending task lists 32980 1727096592.58682: done processing included files 32980 1727096592.58682: results queue empty 32980 1727096592.58683: checking for any_errors_fatal 32980 1727096592.58685: done checking for any_errors_fatal 32980 1727096592.58686: checking for max_fail_percentage 32980 1727096592.58686: done checking for max_fail_percentage 32980 1727096592.58687: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.58687: done checking to see if all hosts have failed 32980 1727096592.58688: getting the remaining hosts for this loop 32980 1727096592.58688: done getting the remaining hosts for this loop 32980 1727096592.58690: getting the next task for host managed_node2 32980 1727096592.58692: done getting next task for host managed_node2 32980 1727096592.58694: ^ task is: TASK: Gather current interface info 32980 1727096592.58697: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.58699: getting variables 32980 1727096592.58700: in VariableManager get_vars() 32980 1727096592.58708: Calling all_inventory to load vars for managed_node2 32980 1727096592.58710: Calling groups_inventory to load vars for managed_node2 32980 1727096592.58711: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.58714: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.58715: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.58717: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.58797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.58905: done with get_vars() 32980 1727096592.58912: done getting variables 32980 1727096592.58939: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:03:12 -0400 (0:00:00.023) 0:00:04.516 ****** 32980 1727096592.58959: entering _queue_task() for managed_node2/command 32980 1727096592.59160: worker is 1 (out of 1 available) 32980 1727096592.59174: exiting _queue_task() for managed_node2/command 32980 1727096592.59186: done queuing things up, now waiting for results queue to drain 32980 1727096592.59188: waiting for pending results... 
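The 'Gather current interface info' task queued above (get_current_interfaces.yml:3) runs the command module; the module_args echoed back further down (chdir=/sys/class/net, _raw_params="ls -1") pin it down to roughly the following, with the register variable name being an assumption since the log never shows it:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # variable name assumed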
32980 1727096592.59354: running TaskExecutor() for managed_node2/TASK: Gather current interface info 32980 1727096592.59497: in run() - task 0afff68d-5257-457d-ef33-00000000034b 32980 1727096592.59501: variable 'ansible_search_path' from source: unknown 32980 1727096592.59504: variable 'ansible_search_path' from source: unknown 32980 1727096592.59511: calling self._execute() 32980 1727096592.59578: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.59581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.59585: variable 'omit' from source: magic vars 32980 1727096592.60073: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.60076: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.60078: variable 'omit' from source: magic vars 32980 1727096592.60081: variable 'omit' from source: magic vars 32980 1727096592.60121: variable 'omit' from source: magic vars 32980 1727096592.60164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096592.60201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096592.60215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096592.60243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.60249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.60276: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096592.60279: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.60281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.60361: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096592.60364: Set connection var ansible_timeout to 10 32980 1727096592.60366: Set connection var ansible_shell_type to sh 32980 1727096592.60375: Set connection var ansible_connection to ssh 32980 1727096592.60382: Set connection var ansible_shell_executable to /bin/sh 32980 1727096592.60473: Set connection var ansible_pipelining to False 32980 1727096592.60476: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.60478: variable 'ansible_connection' from source: unknown 32980 1727096592.60481: variable 'ansible_module_compression' from source: unknown 32980 1727096592.60483: variable 'ansible_shell_type' from source: unknown 32980 1727096592.60485: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.60487: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.60489: variable 'ansible_pipelining' from source: unknown 32980 1727096592.60491: variable 'ansible_timeout' from source: unknown 32980 1727096592.60494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.60588: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096592.60605: variable 'omit' from source: magic vars 32980 
1727096592.60614: starting attempt loop 32980 1727096592.60621: running the handler 32980 1727096592.60641: _low_level_execute_command(): starting 32980 1727096592.60655: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096592.61405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096592.61428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.61444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.61506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.61518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.61560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.63248: stdout chunk (state=3): >>>/root <<< 32980 1727096592.63403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.63406: stdout chunk (state=3): >>><<< 32980 1727096592.63409: stderr chunk (state=3): >>><<< 32980 1727096592.63428: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.63451: _low_level_execute_command(): starting 32980 1727096592.63529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759 `" && echo 
ansible-tmp-1727096592.634366-33238-240261441849759="` echo /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759 `" ) && sleep 0' 32980 1727096592.64090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096592.64161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096592.64164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.64202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.64218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.64240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.64309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.66261: stdout chunk (state=3): >>>ansible-tmp-1727096592.634366-33238-240261441849759=/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759 <<< 32980 1727096592.66361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.66398: stderr chunk (state=3): >>><<< 32980 1727096592.66401: stdout chunk (state=3): >>><<< 32980 1727096592.66419: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096592.634366-33238-240261441849759=/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.66447: variable 'ansible_module_compression' from source: unknown 32980 1727096592.66493: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096592.66524: variable 'ansible_facts' from source: unknown 32980 1727096592.66633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py 32980 1727096592.66775: Sending initial data 32980 1727096592.66781: Sent initial data (155 bytes) 32980 1727096592.67537: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.67589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.67592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.67594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.67634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.69285: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096592.69308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096592.69366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmplt5bkfjh /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py <<< 32980 1727096592.69377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py" <<< 32980 1727096592.69414: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmplt5bkfjh" to remote "/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py" <<< 32980 1727096592.70151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.70191: stderr chunk (state=3): >>><<< 32980 1727096592.70244: stdout chunk (state=3): >>><<< 32980 1727096592.70263: done transferring module to remote 32980 1727096592.70275: _low_level_execute_command(): starting 32980 1727096592.70287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/ /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py && sleep 0' 32980 1727096592.70932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.70935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096592.70938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096592.70940: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096592.70942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.71037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.71062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.72889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.72917: stderr chunk (state=3): >>><<< 32980 1727096592.72920: stdout chunk (state=3): >>><<< 32980 1727096592.72937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.72940: _low_level_execute_command(): starting 32980 1727096592.72944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/AnsiballZ_command.py && sleep 0' 32980 1727096592.73408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.73411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.73413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.73415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096592.73417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.73472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.73479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.73481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.73516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.89320: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:12.888698", "end": "2024-09-23 09:03:12.892165", "delta": "0:00:00.003467", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096592.91433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096592.91437: stdout chunk (state=3): >>><<< 32980 1727096592.91439: stderr chunk (state=3): >>><<< 32980 1727096592.91442: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:12.888698", "end": "2024-09-23 09:03:12.892165", "delta": "0:00:00.003467", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
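The decoded module result above shows what the "Gather current interface info" task actually ran: ansible.legacy.command executing ls -1 with chdir /sys/class/net, returning bonding_masters, eth0 and lo. A minimal sketch of such a task, reconstructed from the logged module_args; the register name and changed_when handling are inferred from later lines in this log, not copied from the real task file:

    # Sketch reconstructed from the module_args in the result above; inferred parts are marked.
    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # name inferred from the set_fact step further down
      changed_when: false             # inferred from the "Evaluated conditional (False): False" line below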
32980 1727096592.91445: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096592.91447: _low_level_execute_command(): starting 32980 1727096592.91450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096592.634366-33238-240261441849759/ > /dev/null 2>&1 && sleep 0' 32980 1727096592.92529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096592.92561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096592.92582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096592.92599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096592.92671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096592.92719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096592.92743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096592.92775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096592.92846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096592.94775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096592.94779: stdout chunk (state=3): >>><<< 32980 1727096592.94793: stderr chunk (state=3): >>><<< 32980 1727096592.95079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096592.95082: handler run complete 32980 1727096592.95085: Evaluated conditional (False): False 32980 1727096592.95087: attempt loop complete, returning result 32980 1727096592.95089: _execute() done 32980 1727096592.95091: dumping result to json 32980 1727096592.95094: done dumping result, returning 32980 1727096592.95096: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-457d-ef33-00000000034b] 32980 1727096592.95098: sending task result for task 0afff68d-5257-457d-ef33-00000000034b 32980 1727096592.95173: done sending task result for task 0afff68d-5257-457d-ef33-00000000034b 32980 1727096592.95176: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003467", "end": "2024-09-23 09:03:12.892165", "rc": 0, "start": "2024-09-23 09:03:12.888698" } STDOUT: bonding_masters eth0 lo 32980 1727096592.95590: no more pending results, returning what we have 32980 1727096592.95593: results queue empty 32980 1727096592.95594: checking for any_errors_fatal 32980 1727096592.95596: done checking for any_errors_fatal 32980 1727096592.95596: checking for max_fail_percentage 32980 1727096592.95598: done checking for max_fail_percentage 32980 1727096592.95599: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.95600: done checking to see if all hosts have failed 32980 1727096592.95600: getting the remaining hosts for this loop 32980 1727096592.95602: done getting the remaining hosts for this loop 32980 1727096592.95606: getting the next task for host managed_node2 32980 1727096592.95614: done getting next task for host managed_node2 32980 1727096592.95616: ^ task is: TASK: Set current_interfaces 32980 1727096592.95622: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.95627: getting variables 32980 1727096592.95628: in VariableManager get_vars() 32980 1727096592.95662: Calling all_inventory to load vars for managed_node2 32980 1727096592.95665: Calling groups_inventory to load vars for managed_node2 32980 1727096592.95773: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.95789: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.95792: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.95796: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.96260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096592.96608: done with get_vars() 32980 1727096592.96618: done getting variables 32980 1727096592.96687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:03:12 -0400 (0:00:00.377) 0:00:04.893 ****** 32980 1727096592.96716: entering _queue_task() for managed_node2/set_fact 32980 1727096592.97028: worker is 1 (out of 1 available) 32980 1727096592.97042: exiting _queue_task() for managed_node2/set_fact 32980 1727096592.97054: done queuing things up, now waiting for results queue to drain 32980 1727096592.97055: waiting for pending results... 
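The task just queued here is the set_fact at get_current_interfaces.yml:9. Only the resulting fact value appears further down in the log, so the Jinja2 expression in this sketch is an assumption:

    # Plausible shape of the "Set current_interfaces" task; the expression is assumed.
    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"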
32980 1727096592.97306: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 32980 1727096592.97474: in run() - task 0afff68d-5257-457d-ef33-00000000034c 32980 1727096592.97478: variable 'ansible_search_path' from source: unknown 32980 1727096592.97481: variable 'ansible_search_path' from source: unknown 32980 1727096592.97493: calling self._execute() 32980 1727096592.97579: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.97590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.97607: variable 'omit' from source: magic vars 32980 1727096592.97983: variable 'ansible_distribution_major_version' from source: facts 32980 1727096592.98000: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096592.98010: variable 'omit' from source: magic vars 32980 1727096592.98175: variable 'omit' from source: magic vars 32980 1727096592.98184: variable '_current_interfaces' from source: set_fact 32980 1727096592.98246: variable 'omit' from source: magic vars 32980 1727096592.98300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096592.98338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096592.98361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096592.98389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.98409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096592.98439: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096592.98446: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.98453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.98590: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096592.98601: Set connection var ansible_timeout to 10 32980 1727096592.98615: Set connection var ansible_shell_type to sh 32980 1727096592.98624: Set connection var ansible_connection to ssh 32980 1727096592.98644: Set connection var ansible_shell_executable to /bin/sh 32980 1727096592.98719: Set connection var ansible_pipelining to False 32980 1727096592.98840: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.98844: variable 'ansible_connection' from source: unknown 32980 1727096592.98846: variable 'ansible_module_compression' from source: unknown 32980 1727096592.98848: variable 'ansible_shell_type' from source: unknown 32980 1727096592.98850: variable 'ansible_shell_executable' from source: unknown 32980 1727096592.98852: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096592.98854: variable 'ansible_pipelining' from source: unknown 32980 1727096592.98856: variable 'ansible_timeout' from source: unknown 32980 1727096592.98858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096592.99154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32980 1727096592.99158: variable 'omit' from source: magic vars 32980 1727096592.99161: starting attempt loop 32980 1727096592.99163: running the handler 32980 1727096592.99170: handler run complete 32980 1727096592.99173: attempt loop complete, returning result 32980 1727096592.99175: _execute() done 32980 1727096592.99178: dumping result to json 32980 1727096592.99180: done dumping result, returning 32980 1727096592.99205: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-457d-ef33-00000000034c] 32980 1727096592.99264: sending task result for task 0afff68d-5257-457d-ef33-00000000034c ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32980 1727096592.99427: no more pending results, returning what we have 32980 1727096592.99430: results queue empty 32980 1727096592.99431: checking for any_errors_fatal 32980 1727096592.99436: done checking for any_errors_fatal 32980 1727096592.99437: checking for max_fail_percentage 32980 1727096592.99439: done checking for max_fail_percentage 32980 1727096592.99439: checking to see if all hosts have failed and the running result is not ok 32980 1727096592.99440: done checking to see if all hosts have failed 32980 1727096592.99441: getting the remaining hosts for this loop 32980 1727096592.99442: done getting the remaining hosts for this loop 32980 1727096592.99446: getting the next task for host managed_node2 32980 1727096592.99456: done getting next task for host managed_node2 32980 1727096592.99459: ^ task is: TASK: Show current_interfaces 32980 1727096592.99464: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096592.99572: getting variables 32980 1727096592.99575: in VariableManager get_vars() 32980 1727096592.99614: Calling all_inventory to load vars for managed_node2 32980 1727096592.99617: Calling groups_inventory to load vars for managed_node2 32980 1727096592.99620: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096592.99630: Calling all_plugins_play to load vars for managed_node2 32980 1727096592.99632: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096592.99635: Calling groups_plugins_play to load vars for managed_node2 32980 1727096592.99884: done sending task result for task 0afff68d-5257-457d-ef33-00000000034c 32980 1727096592.99889: WORKER PROCESS EXITING 32980 1727096592.99912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096593.00193: done with get_vars() 32980 1727096593.00202: done getting variables 32980 1727096593.00266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:03:13 -0400 (0:00:00.036) 0:00:04.930 ****** 32980 1727096593.00337: entering _queue_task() for managed_node2/debug 32980 1727096593.00792: worker is 1 (out of 1 available) 32980 1727096593.00801: exiting _queue_task() for managed_node2/debug 32980 1727096593.00808: done queuing things up, now waiting for results queue to drain 32980 1727096593.00810: waiting for pending results... 
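The task queued here is the debug action at show_interfaces.yml:5. Its printed MSG appears below as "current_interfaces: ['bonding_masters', 'eth0', 'lo']", which suggests a message of roughly this form (a sketch, not the verbatim task):

    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"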
32980 1727096593.00869: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 32980 1727096593.00964: in run() - task 0afff68d-5257-457d-ef33-000000000315 32980 1727096593.01037: variable 'ansible_search_path' from source: unknown 32980 1727096593.01040: variable 'ansible_search_path' from source: unknown 32980 1727096593.01043: calling self._execute() 32980 1727096593.01109: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.01122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.01141: variable 'omit' from source: magic vars 32980 1727096593.01534: variable 'ansible_distribution_major_version' from source: facts 32980 1727096593.01551: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096593.01562: variable 'omit' from source: magic vars 32980 1727096593.01617: variable 'omit' from source: magic vars 32980 1727096593.01803: variable 'current_interfaces' from source: set_fact 32980 1727096593.01806: variable 'omit' from source: magic vars 32980 1727096593.01809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096593.01845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096593.01874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096593.01897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096593.01922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096593.01957: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096593.01966: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.01979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.02093: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096593.02105: Set connection var ansible_timeout to 10 32980 1727096593.02112: Set connection var ansible_shell_type to sh 32980 1727096593.02119: Set connection var ansible_connection to ssh 32980 1727096593.02139: Set connection var ansible_shell_executable to /bin/sh 32980 1727096593.02149: Set connection var ansible_pipelining to False 32980 1727096593.02239: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.02242: variable 'ansible_connection' from source: unknown 32980 1727096593.02245: variable 'ansible_module_compression' from source: unknown 32980 1727096593.02247: variable 'ansible_shell_type' from source: unknown 32980 1727096593.02250: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.02252: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.02255: variable 'ansible_pipelining' from source: unknown 32980 1727096593.02257: variable 'ansible_timeout' from source: unknown 32980 1727096593.02259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.02375: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
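Each executor run above resolves the same connection settings for managed_node2: ssh connection, sh shell via /bin/sh, a 10 second timeout, pipelining off and ZIP_DEFLATED module compression. A hypothetical inventory-style host entry that would yield these values; the actual inventory is not reproduced in this log:

    # Illustration only; values mirror the "Set connection var" lines in this log.
    managed_node2:
      ansible_host: 10.31.15.126
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false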
32980 1727096593.02410: variable 'omit' from source: magic vars 32980 1727096593.02427: starting attempt loop 32980 1727096593.02438: running the handler 32980 1727096593.02689: handler run complete 32980 1727096593.02692: attempt loop complete, returning result 32980 1727096593.02694: _execute() done 32980 1727096593.02696: dumping result to json 32980 1727096593.02698: done dumping result, returning 32980 1727096593.02700: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-457d-ef33-000000000315] 32980 1727096593.02702: sending task result for task 0afff68d-5257-457d-ef33-000000000315 32980 1727096593.02764: done sending task result for task 0afff68d-5257-457d-ef33-000000000315 32980 1727096593.02774: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32980 1727096593.02819: no more pending results, returning what we have 32980 1727096593.02822: results queue empty 32980 1727096593.02823: checking for any_errors_fatal 32980 1727096593.02829: done checking for any_errors_fatal 32980 1727096593.02830: checking for max_fail_percentage 32980 1727096593.02831: done checking for max_fail_percentage 32980 1727096593.02832: checking to see if all hosts have failed and the running result is not ok 32980 1727096593.02833: done checking to see if all hosts have failed 32980 1727096593.02833: getting the remaining hosts for this loop 32980 1727096593.02834: done getting the remaining hosts for this loop 32980 1727096593.02838: getting the next task for host managed_node2 32980 1727096593.02847: done getting next task for host managed_node2 32980 1727096593.02849: ^ task is: TASK: Install iproute 32980 1727096593.02852: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096593.02855: getting variables 32980 1727096593.02857: in VariableManager get_vars() 32980 1727096593.02896: Calling all_inventory to load vars for managed_node2 32980 1727096593.02899: Calling groups_inventory to load vars for managed_node2 32980 1727096593.02902: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096593.02913: Calling all_plugins_play to load vars for managed_node2 32980 1727096593.02916: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096593.02919: Calling groups_plugins_play to load vars for managed_node2 32980 1727096593.03589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096593.03794: done with get_vars() 32980 1727096593.03804: done getting variables 32980 1727096593.03898: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:03:13 -0400 (0:00:00.035) 0:00:04.965 ****** 32980 1727096593.03929: entering _queue_task() for managed_node2/package 32980 1727096593.04508: worker is 1 (out of 1 available) 32980 1727096593.04522: exiting _queue_task() for managed_node2/package 32980 1727096593.04533: done queuing things up, now waiting for results queue to drain 32980 1727096593.04534: waiting for pending results... 
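The task queued here is at manage_test_interface.yml:16 and dispatches the generic package action. The module arguments and the "__install_status is success" conditional recorded further down suggest a shape like the following; the retries and delay values are assumptions:

    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3   # assumed; this run succeeds on the first attempt ("attempts": 1)
      delay: 2     # assumed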
32980 1727096593.05095: running TaskExecutor() for managed_node2/TASK: Install iproute 32980 1727096593.05099: in run() - task 0afff68d-5257-457d-ef33-00000000021e 32980 1727096593.05103: variable 'ansible_search_path' from source: unknown 32980 1727096593.05105: variable 'ansible_search_path' from source: unknown 32980 1727096593.05108: calling self._execute() 32980 1727096593.05180: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.05213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.05229: variable 'omit' from source: magic vars 32980 1727096593.05602: variable 'ansible_distribution_major_version' from source: facts 32980 1727096593.05620: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096593.05630: variable 'omit' from source: magic vars 32980 1727096593.05717: variable 'omit' from source: magic vars 32980 1727096593.05866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096593.08782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096593.08926: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096593.09275: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096593.09280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096593.09283: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096593.09381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096593.09420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096593.09452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096593.09514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096593.09616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096593.09649: variable '__network_is_ostree' from source: set_fact 32980 1727096593.09661: variable 'omit' from source: magic vars 32980 1727096593.09701: variable 'omit' from source: magic vars 32980 1727096593.09739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096593.09769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096593.09794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096593.09819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32980 1727096593.09839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096593.09875: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096593.09885: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.09893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.09996: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096593.10007: Set connection var ansible_timeout to 10 32980 1727096593.10013: Set connection var ansible_shell_type to sh 32980 1727096593.10021: Set connection var ansible_connection to ssh 32980 1727096593.10049: Set connection var ansible_shell_executable to /bin/sh 32980 1727096593.10051: Set connection var ansible_pipelining to False 32980 1727096593.10072: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.10158: variable 'ansible_connection' from source: unknown 32980 1727096593.10161: variable 'ansible_module_compression' from source: unknown 32980 1727096593.10163: variable 'ansible_shell_type' from source: unknown 32980 1727096593.10165: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.10169: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.10171: variable 'ansible_pipelining' from source: unknown 32980 1727096593.10173: variable 'ansible_timeout' from source: unknown 32980 1727096593.10175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.10215: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096593.10229: variable 'omit' from source: magic vars 32980 1727096593.10240: starting attempt loop 32980 1727096593.10246: running the handler 32980 1727096593.10258: variable 'ansible_facts' from source: unknown 32980 1727096593.10274: variable 'ansible_facts' from source: unknown 32980 1727096593.10349: _low_level_execute_command(): starting 32980 1727096593.10361: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096593.11090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096593.11108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096593.11123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.11175: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.11182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.11185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.11223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.12913: stdout chunk (state=3): >>>/root <<< 32980 1727096593.13141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.13145: stdout chunk (state=3): >>><<< 32980 1727096593.13147: stderr chunk (state=3): >>><<< 32980 1727096593.13151: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.13161: _low_level_execute_command(): starting 32980 1727096593.13164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313 `" && echo ansible-tmp-1727096593.1305625-33261-165250625959313="` echo /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313 `" ) && sleep 0' 32980 1727096593.14011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096593.14044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.14047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096593.14050: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096593.14052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.14102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.14105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.14154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.16120: stdout chunk (state=3): >>>ansible-tmp-1727096593.1305625-33261-165250625959313=/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313 <<< 32980 1727096593.16309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.16314: stdout chunk (state=3): >>><<< 32980 1727096593.16316: stderr chunk (state=3): >>><<< 32980 1727096593.16610: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096593.1305625-33261-165250625959313=/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.16618: variable 'ansible_module_compression' from source: unknown 32980 1727096593.16622: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 32980 1727096593.16625: ANSIBALLZ: Acquiring lock 32980 1727096593.16627: ANSIBALLZ: Lock acquired: 140258569802416 32980 1727096593.16629: ANSIBALLZ: Creating module 32980 1727096593.31907: ANSIBALLZ: Writing module into payload 32980 1727096593.32105: ANSIBALLZ: Writing module 32980 1727096593.32109: ANSIBALLZ: Renaming module 32980 1727096593.32122: ANSIBALLZ: Done creating module 32980 1727096593.32146: variable 'ansible_facts' from source: unknown 32980 1727096593.32262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py 32980 1727096593.32383: Sending initial data 32980 1727096593.32386: Sent initial data (152 bytes) 32980 1727096593.32949: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096593.32952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096593.32956: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.32959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096593.32961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.33027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.33030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.33033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.33078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.34723: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096593.34751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096593.34787: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpnzotbf9x /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py <<< 32980 1727096593.34790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py" <<< 32980 1727096593.34817: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpnzotbf9x" to remote "/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py" <<< 32980 1727096593.35444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.35494: stderr chunk (state=3): >>><<< 32980 1727096593.35497: stdout chunk (state=3): >>><<< 32980 1727096593.35516: done transferring module to remote 32980 1727096593.35524: _low_level_execute_command(): starting 32980 1727096593.35529: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/ /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py && sleep 0' 32980 1727096593.35955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096593.35959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096593.35993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096593.35996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096593.35999: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.36054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.36061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.36063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.36095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.38005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.38009: stdout chunk (state=3): >>><<< 32980 1727096593.38012: stderr chunk (state=3): >>><<< 32980 1727096593.38014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.38017: _low_level_execute_command(): starting 32980 1727096593.38019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/AnsiballZ_dnf.py && sleep 0' 32980 1727096593.38875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.38879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.38882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.80804: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 32980 1727096593.84966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096593.84992: stdout chunk (state=3): >>><<< 32980 1727096593.85276: stderr chunk (state=3): >>><<< 32980 1727096593.85280: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
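On this host the generic package action was routed to ansible.legacy.dnf, as the module invocation above shows, and dnf answered "Nothing to do" because iproute was already present. An equivalent explicit form of the same task, shown only to make that routing concrete (an assumption, not the actual task text):

    - name: Install iproute (explicit dnf form)
      ansible.builtin.dnf:
        name: iproute
        state: present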
32980 1727096593.85287: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096593.85290: _low_level_execute_command(): starting 32980 1727096593.85292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096593.1305625-33261-165250625959313/ > /dev/null 2>&1 && sleep 0' 32980 1727096593.86639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.86664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.86716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.86744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.88622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.88677: stderr chunk (state=3): >>><<< 32980 1727096593.88692: stdout chunk (state=3): >>><<< 32980 1727096593.88715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.88722: handler run complete 32980 1727096593.88907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096593.89275: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096593.89278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096593.89280: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096593.89283: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096593.89285: variable '__install_status' from source: unknown 32980 1727096593.89287: Evaluated conditional (__install_status is success): True 32980 1727096593.89305: attempt loop complete, returning result 32980 1727096593.89307: _execute() done 32980 1727096593.89309: dumping result to json 32980 1727096593.89311: done dumping result, returning 32980 1727096593.89375: done running TaskExecutor() for managed_node2/TASK: Install iproute [0afff68d-5257-457d-ef33-00000000021e] 32980 1727096593.89378: sending task result for task 0afff68d-5257-457d-ef33-00000000021e 32980 1727096593.89444: done sending task result for task 0afff68d-5257-457d-ef33-00000000021e 32980 1727096593.89447: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 32980 1727096593.89534: no more pending results, returning what we have 32980 1727096593.89538: results queue empty 32980 1727096593.89539: checking for any_errors_fatal 32980 1727096593.89544: done checking for any_errors_fatal 32980 1727096593.89545: checking for max_fail_percentage 32980 1727096593.89546: done checking for max_fail_percentage 32980 1727096593.89547: checking to see if all hosts have failed and the running result is not ok 32980 1727096593.89548: done checking to see if all hosts have failed 32980 1727096593.89549: getting the remaining hosts for this loop 32980 1727096593.89550: done getting the remaining hosts for this loop 32980 1727096593.89555: getting the next task for host managed_node2 32980 1727096593.89562: done getting next task for host managed_node2 32980 1727096593.89565: ^ task is: TASK: Create veth interface {{ interface }} 32980 1727096593.89570: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096593.89573: getting variables 32980 1727096593.89575: in VariableManager get_vars() 32980 1727096593.89619: Calling all_inventory to load vars for managed_node2 32980 1727096593.89622: Calling groups_inventory to load vars for managed_node2 32980 1727096593.89625: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096593.89637: Calling all_plugins_play to load vars for managed_node2 32980 1727096593.89639: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096593.89642: Calling groups_plugins_play to load vars for managed_node2 32980 1727096593.90172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096593.90379: done with get_vars() 32980 1727096593.90389: done getting variables 32980 1727096593.90452: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096593.90566: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:03:13 -0400 (0:00:00.866) 0:00:05.832 ****** 32980 1727096593.90614: entering _queue_task() for managed_node2/command 32980 1727096593.91092: worker is 1 (out of 1 available) 32980 1727096593.91100: exiting _queue_task() for managed_node2/command 32980 1727096593.91109: done queuing things up, now waiting for results queue to drain 32980 1727096593.91110: waiting for pending results... 
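The entries that follow trace the "Create veth interface lsr101" task from manage_test_interface.yml:27. Judging from the items lookup being loaded and the conditionals evaluated in the trace, the task is presumably shaped roughly like the sketch below. This is a reconstruction inferred from the log, not the actual contents of the test file; the variable names and the peer{{ interface }} naming are assumptions, and the loop may carry more items than the two visible in this part of the trace.

    # Hypothetical reconstruction of the traced task (inferred from the log, not copied from the repo)
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"                      # ansible.legacy.command, as loaded in the trace
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
      when:
        - ansible_distribution_major_version != '6'   # may instead live on an enclosing block
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces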
32980 1727096593.91152: running TaskExecutor() for managed_node2/TASK: Create veth interface lsr101 32980 1727096593.91260: in run() - task 0afff68d-5257-457d-ef33-00000000021f 32980 1727096593.91282: variable 'ansible_search_path' from source: unknown 32980 1727096593.91288: variable 'ansible_search_path' from source: unknown 32980 1727096593.91775: variable 'interface' from source: play vars 32980 1727096593.91989: variable 'interface' from source: play vars 32980 1727096593.91992: variable 'interface' from source: play vars 32980 1727096593.92074: Loaded config def from plugin (lookup/items) 32980 1727096593.92088: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 32980 1727096593.92112: variable 'omit' from source: magic vars 32980 1727096593.92225: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.92240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.92263: variable 'omit' from source: magic vars 32980 1727096593.92489: variable 'ansible_distribution_major_version' from source: facts 32980 1727096593.92502: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096593.92702: variable 'type' from source: play vars 32980 1727096593.92711: variable 'state' from source: include params 32980 1727096593.92719: variable 'interface' from source: play vars 32980 1727096593.92726: variable 'current_interfaces' from source: set_fact 32980 1727096593.92736: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32980 1727096593.92750: variable 'omit' from source: magic vars 32980 1727096593.92789: variable 'omit' from source: magic vars 32980 1727096593.92835: variable 'item' from source: unknown 32980 1727096593.92969: variable 'item' from source: unknown 32980 1727096593.92973: variable 'omit' from source: magic vars 32980 1727096593.92975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096593.93003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096593.93030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096593.93055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096593.93079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096593.93114: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096593.93124: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.93133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.93291: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096593.93294: Set connection var ansible_timeout to 10 32980 1727096593.93296: Set connection var ansible_shell_type to sh 32980 1727096593.93298: Set connection var ansible_connection to ssh 32980 1727096593.93301: Set connection var ansible_shell_executable to /bin/sh 32980 1727096593.93303: Set connection var ansible_pipelining to False 32980 1727096593.93305: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.93306: variable 'ansible_connection' from source: unknown 32980 1727096593.93308: 
variable 'ansible_module_compression' from source: unknown 32980 1727096593.93310: variable 'ansible_shell_type' from source: unknown 32980 1727096593.93313: variable 'ansible_shell_executable' from source: unknown 32980 1727096593.93320: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096593.93327: variable 'ansible_pipelining' from source: unknown 32980 1727096593.93333: variable 'ansible_timeout' from source: unknown 32980 1727096593.93342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096593.93479: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096593.93494: variable 'omit' from source: magic vars 32980 1727096593.93617: starting attempt loop 32980 1727096593.93620: running the handler 32980 1727096593.93622: _low_level_execute_command(): starting 32980 1727096593.93624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096593.94227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096593.94283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096593.94298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.94366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096593.94404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.94438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.94475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.96175: stdout chunk (state=3): >>>/root <<< 32980 1727096593.96372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.96378: stdout chunk (state=3): >>><<< 32980 1727096593.96381: stderr chunk (state=3): >>><<< 32980 1727096593.96502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.96506: _low_level_execute_command(): starting 32980 1727096593.96517: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255 `" && echo ansible-tmp-1727096593.9640446-33316-252841859149255="` echo /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255 `" ) && sleep 0' 32980 1727096593.97105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096593.97207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096593.97299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096593.97333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096593.97415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096593.99408: stdout chunk (state=3): >>>ansible-tmp-1727096593.9640446-33316-252841859149255=/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255 <<< 32980 1727096593.99509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096593.99512: stdout chunk (state=3): >>><<< 32980 1727096593.99514: stderr chunk (state=3): >>><<< 32980 1727096593.99537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096593.9640446-33316-252841859149255=/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096593.99672: variable 'ansible_module_compression' from source: unknown 32980 1727096593.99675: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096593.99678: variable 'ansible_facts' from source: unknown 32980 1727096593.99758: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py 32980 1727096593.99922: Sending initial data 32980 1727096593.99931: Sent initial data (156 bytes) 32980 1727096594.00363: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.00379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096594.00393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.00434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.00448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.00485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.02130: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096594.02182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096594.02230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp5wvtlipa /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py <<< 32980 1727096594.02234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py" <<< 32980 1727096594.02313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp5wvtlipa" to remote "/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py" <<< 32980 1727096594.03033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.03134: stderr chunk (state=3): >>><<< 32980 1727096594.03146: stdout chunk (state=3): >>><<< 32980 1727096594.03149: done transferring module to remote 32980 1727096594.03159: _low_level_execute_command(): starting 32980 1727096594.03172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/ /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py && sleep 0' 32980 1727096594.03656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.03663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096594.03694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.03697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.03699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.03702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.03751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.03754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.03793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.05756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.05759: stdout chunk (state=3): >>><<< 32980 1727096594.05766: stderr chunk (state=3): >>><<< 32980 1727096594.05772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.05774: _low_level_execute_command(): starting 32980 1727096594.05777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/AnsiballZ_command.py && sleep 0' 32980 1727096594.06388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.06399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.06411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.06481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.22886: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-23 09:03:14.218239", "end": "2024-09-23 09:03:14.226017", "delta": "0:00:00.007778", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096594.25188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096594.25219: stderr chunk (state=3): >>><<< 32980 1727096594.25222: stdout chunk (state=3): >>><<< 32980 1727096594.25241: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-23 09:03:14.218239", "end": "2024-09-23 09:03:14.226017", "delta": "0:00:00.007778", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
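The exchange above is the full per-command transport cycle: mkdir a remote tmpdir, SFTP the AnsiballZ_command.py wrapper across, chmod u+x it, run it with /usr/bin/python3.12, then rm -f -r the tmpdir (below). The cycle repeats for every loop item because the trace sets ansible_pipelining to False. As a side note, standard ansible-core behavior (not something this log states explicitly) is that enabling pipelining feeds the wrapper to the remote interpreter over the existing SSH session, skipping the tmpdir and SFTP steps, e.g. via a host or group variable:

    # Hypothetical group_vars entry; pipes modules to the remote python
    # instead of copying them via SFTP first.
    ansible_pipelining: true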
32980 1727096594.25271: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096594.25279: _low_level_execute_command(): starting 32980 1727096594.25288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096593.9640446-33316-252841859149255/ > /dev/null 2>&1 && sleep 0' 32980 1727096594.25747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.25751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.25754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096594.25756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.25758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.25811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.25815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.25817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.25860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.29989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.30012: stderr chunk (state=3): >>><<< 32980 1727096594.30015: stdout chunk (state=3): >>><<< 32980 1727096594.30033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.30039: handler run complete 32980 1727096594.30060: Evaluated conditional (False): False 32980 1727096594.30069: attempt loop complete, returning result 32980 1727096594.30087: variable 'item' from source: unknown 32980 1727096594.30149: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.007778", "end": "2024-09-23 09:03:14.226017", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-23 09:03:14.218239" } 32980 1727096594.30319: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.30323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.30325: variable 'omit' from source: magic vars 32980 1727096594.30416: variable 'ansible_distribution_major_version' from source: facts 32980 1727096594.30419: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096594.30534: variable 'type' from source: play vars 32980 1727096594.30538: variable 'state' from source: include params 32980 1727096594.30541: variable 'interface' from source: play vars 32980 1727096594.30545: variable 'current_interfaces' from source: set_fact 32980 1727096594.30552: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32980 1727096594.30555: variable 'omit' from source: magic vars 32980 1727096594.30566: variable 'omit' from source: magic vars 32980 1727096594.30593: variable 'item' from source: unknown 32980 1727096594.30636: variable 'item' from source: unknown 32980 1727096594.30647: variable 'omit' from source: magic vars 32980 1727096594.30666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096594.30678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096594.30681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096594.30691: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096594.30694: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.30696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.30747: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096594.30750: Set connection var ansible_timeout to 10 32980 1727096594.30752: Set connection var ansible_shell_type to sh 32980 1727096594.30755: Set connection var ansible_connection to ssh 32980 1727096594.30761: 
Set connection var ansible_shell_executable to /bin/sh 32980 1727096594.30765: Set connection var ansible_pipelining to False 32980 1727096594.30784: variable 'ansible_shell_executable' from source: unknown 32980 1727096594.30787: variable 'ansible_connection' from source: unknown 32980 1727096594.30789: variable 'ansible_module_compression' from source: unknown 32980 1727096594.30791: variable 'ansible_shell_type' from source: unknown 32980 1727096594.30794: variable 'ansible_shell_executable' from source: unknown 32980 1727096594.30796: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.30798: variable 'ansible_pipelining' from source: unknown 32980 1727096594.30802: variable 'ansible_timeout' from source: unknown 32980 1727096594.30806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.30869: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096594.30879: variable 'omit' from source: magic vars 32980 1727096594.30882: starting attempt loop 32980 1727096594.30884: running the handler 32980 1727096594.30891: _low_level_execute_command(): starting 32980 1727096594.30894: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096594.31328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.31357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096594.31364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.31367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.31372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.31422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.31425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.31427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.31472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.33127: stdout chunk (state=3): >>>/root <<< 32980 1727096594.33222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.33250: stderr chunk (state=3): >>><<< 32980 1727096594.33253: stdout chunk (state=3): >>><<< 32980 1727096594.33270: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.33282: _low_level_execute_command(): starting 32980 1727096594.33287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875 `" && echo ansible-tmp-1727096594.3327053-33316-163560237696875="` echo /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875 `" ) && sleep 0' 32980 1727096594.33727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.33730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.33733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096594.33735: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.33737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.33786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.33789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.33826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.35731: stdout chunk (state=3): >>>ansible-tmp-1727096594.3327053-33316-163560237696875=/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875 <<< 32980 1727096594.35833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.35864: stderr chunk (state=3): >>><<< 32980 1727096594.35870: stdout chunk (state=3): >>><<< 32980 1727096594.35884: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1727096594.3327053-33316-163560237696875=/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.35904: variable 'ansible_module_compression' from source: unknown 32980 1727096594.35933: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096594.35949: variable 'ansible_facts' from source: unknown 32980 1727096594.36005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py 32980 1727096594.36096: Sending initial data 32980 1727096594.36099: Sent initial data (156 bytes) 32980 1727096594.36543: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.36546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096594.36549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.36551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.36553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.36608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.36616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.36619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.36647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.38208: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32980 1727096594.38221: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096594.38240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096594.38273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp5ka_pk4x /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py <<< 32980 1727096594.38286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py" <<< 32980 1727096594.38307: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp5ka_pk4x" to remote "/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py" <<< 32980 1727096594.38309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py" <<< 32980 1727096594.38790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.38832: stderr chunk (state=3): >>><<< 32980 1727096594.38835: stdout chunk (state=3): >>><<< 32980 1727096594.38879: done transferring module to remote 32980 1727096594.38886: _low_level_execute_command(): starting 32980 1727096594.38891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/ /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py && sleep 0' 32980 1727096594.39327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.39360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096594.39363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096594.39365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096594.39369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096594.39372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.39418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.39421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.39458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.41300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.41305: stdout chunk (state=3): >>><<< 32980 1727096594.41309: stderr chunk (state=3): >>><<< 32980 1727096594.41476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.41480: _low_level_execute_command(): starting 32980 1727096594.41483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/AnsiballZ_command.py && sleep 0' 32980 1727096594.42918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096594.43010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.43155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.43190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.43306: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32980 1727096594.59212: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-23 09:03:14.587027", "end": "2024-09-23 09:03:14.591046", "delta": "0:00:00.004019", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096594.60890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096594.61224: stdout chunk (state=3): >>><<< 32980 1727096594.61227: stderr chunk (state=3): >>><<< 32980 1727096594.61230: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-23 09:03:14.587027", "end": "2024-09-23 09:03:14.591046", "delta": "0:00:00.004019", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096594.61233: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096594.61235: _low_level_execute_command(): starting 32980 1727096594.61237: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096594.3327053-33316-163560237696875/ > /dev/null 2>&1 && sleep 0' 32980 1727096594.62352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.62487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.62528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.62540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.62621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.62693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.65175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.65181: stdout chunk (state=3): >>><<< 32980 1727096594.65184: stderr chunk (state=3): >>><<< 32980 1727096594.65187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.65190: handler run complete 32980 1727096594.65192: Evaluated conditional (False): False 32980 1727096594.65195: attempt loop complete, returning result 32980 1727096594.65197: variable 'item' from source: unknown 32980 1727096594.65199: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.004019", "end": "2024-09-23 09:03:14.591046", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-23 09:03:14.587027" } 32980 1727096594.65682: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.65686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.65688: variable 'omit' from source: magic vars 32980 1727096594.65938: variable 'ansible_distribution_major_version' from source: facts 32980 1727096594.65950: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096594.66345: variable 'type' from source: play vars 32980 1727096594.66356: variable 'state' from source: include params 32980 1727096594.66364: variable 'interface' from source: play vars 32980 1727096594.66375: variable 'current_interfaces' from source: set_fact 32980 1727096594.66387: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32980 1727096594.66396: variable 'omit' from source: magic vars 32980 1727096594.66418: variable 'omit' from source: magic vars 32980 1727096594.66462: variable 'item' from source: unknown 32980 1727096594.66878: variable 'item' from source: unknown 32980 1727096594.66882: variable 'omit' from source: magic vars 32980 1727096594.66885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096594.66887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096594.66895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096594.66897: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096594.66900: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.66902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.67272: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096594.67275: Set connection var ansible_timeout to 10 32980 1727096594.67278: Set connection var ansible_shell_type to sh 32980 1727096594.67280: Set connection var ansible_connection to ssh 32980 1727096594.67282: Set connection var ansible_shell_executable to /bin/sh 32980 1727096594.67284: Set connection var ansible_pipelining to False 32980 1727096594.67286: variable 'ansible_shell_executable' from source: unknown 32980 1727096594.67287: variable 'ansible_connection' from source: 
unknown 32980 1727096594.67289: variable 'ansible_module_compression' from source: unknown 32980 1727096594.67291: variable 'ansible_shell_type' from source: unknown 32980 1727096594.67293: variable 'ansible_shell_executable' from source: unknown 32980 1727096594.67295: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096594.67296: variable 'ansible_pipelining' from source: unknown 32980 1727096594.67298: variable 'ansible_timeout' from source: unknown 32980 1727096594.67300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096594.67302: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096594.67304: variable 'omit' from source: magic vars 32980 1727096594.67306: starting attempt loop 32980 1727096594.67308: running the handler 32980 1727096594.67310: _low_level_execute_command(): starting 32980 1727096594.67478: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096594.68669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.68784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.68825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.68982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.69042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.70728: stdout chunk (state=3): >>>/root <<< 32980 1727096594.70829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.70866: stderr chunk (state=3): >>><<< 32980 1727096594.70988: stdout chunk (state=3): >>><<< 32980 1727096594.71086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.71089: _low_level_execute_command(): starting 32980 1727096594.71092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420 `" && echo ansible-tmp-1727096594.7100534-33316-146192289846420="` echo /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420 `" ) && sleep 0' 32980 1727096594.72104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.72116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.72283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.72372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.72453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.72505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.74541: stdout chunk (state=3): >>>ansible-tmp-1727096594.7100534-33316-146192289846420=/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420 <<< 32980 1727096594.74585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.74622: stderr chunk (state=3): >>><<< 32980 1727096594.74734: stdout chunk (state=3): >>><<< 32980 1727096594.75175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096594.7100534-33316-146192289846420=/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.75182: variable 'ansible_module_compression' from source: unknown 32980 1727096594.75185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096594.75187: variable 'ansible_facts' from source: unknown 32980 1727096594.75189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py 32980 1727096594.75495: Sending initial data 32980 1727096594.75499: Sent initial data (156 bytes) 32980 1727096594.76435: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096594.76781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.76827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.76858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.78486: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096594.78516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096594.78548: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpa90ok8xl /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py <<< 32980 1727096594.78787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py" <<< 32980 1727096594.78790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpa90ok8xl" to remote "/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py" <<< 32980 1727096594.79872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.80010: stderr chunk (state=3): >>><<< 32980 1727096594.80013: stdout chunk (state=3): >>><<< 32980 1727096594.80016: done transferring module to remote 32980 1727096594.80018: _low_level_execute_command(): starting 32980 1727096594.80020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/ /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py && sleep 0' 32980 1727096594.81022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.81026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096594.81029: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.81147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096594.81157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096594.81285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.81370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.81417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096594.83249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096594.83444: stderr chunk (state=3): >>><<< 32980 1727096594.83448: stdout chunk (state=3): >>><<< 32980 1727096594.83541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096594.83545: _low_level_execute_command(): starting 32980 1727096594.83547: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/AnsiballZ_command.py && sleep 0' 32980 1727096594.84662: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096594.84977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096594.84997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096594.85023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096594.85100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.00862: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-23 09:03:15.003393", "end": "2024-09-23 09:03:15.007357", "delta": "0:00:00.003964", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096595.02562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096595.02580: stdout chunk (state=3): >>><<< 32980 1727096595.02594: stderr chunk (state=3): >>><<< 32980 1727096595.02615: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-23 09:03:15.003393", "end": "2024-09-23 09:03:15.007357", "delta": "0:00:00.003964", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
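
Editor's note: the JSON payload just above is the raw module result for the loop item "ip link set lsr101 up"; together with the "ip link set peerlsr101 up" item reported earlier in this excerpt, it suggests the shape of the looped command task being executed here. The following is only a hypothetical sketch inferred from the logged loop items and the conditional (type == 'veth' and state == 'present' and interface not in current_interfaces), not the actual contents of manage_test_interface.yml; the first loop item in particular is assumed.

    # Hypothetical sketch, not the real task file; the "ip link add" item is assumed.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      loop:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
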
32980 1727096595.02989: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096595.02993: _low_level_execute_command(): starting 32980 1727096595.02995: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096594.7100534-33316-146192289846420/ > /dev/null 2>&1 && sleep 0' 32980 1727096595.04188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.04428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096595.04451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.04519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.06456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.06466: stdout chunk (state=3): >>><<< 32980 1727096595.06487: stderr chunk (state=3): >>><<< 32980 1727096595.06510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.06513: handler run complete 32980 1727096595.06535: Evaluated conditional (False): False 32980 1727096595.06544: attempt loop complete, returning result 32980 1727096595.06563: variable 'item' from source: unknown 32980 1727096595.06644: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr101", "up" ], "delta": "0:00:00.003964", "end": "2024-09-23 09:03:15.007357", "item": "ip link set lsr101 up", "rc": 0, "start": "2024-09-23 09:03:15.003393" } 32980 1727096595.06993: dumping result to json 32980 1727096595.07002: done dumping result, returning 32980 1727096595.07011: done running TaskExecutor() for managed_node2/TASK: Create veth interface lsr101 [0afff68d-5257-457d-ef33-00000000021f] 32980 1727096595.07016: sending task result for task 0afff68d-5257-457d-ef33-00000000021f 32980 1727096595.07253: no more pending results, returning what we have 32980 1727096595.07257: results queue empty 32980 1727096595.07258: checking for any_errors_fatal 32980 1727096595.07264: done checking for any_errors_fatal 32980 1727096595.07265: checking for max_fail_percentage 32980 1727096595.07267: done checking for max_fail_percentage 32980 1727096595.07475: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.07476: done checking to see if all hosts have failed 32980 1727096595.07477: getting the remaining hosts for this loop 32980 1727096595.07479: done getting the remaining hosts for this loop 32980 1727096595.07482: getting the next task for host managed_node2 32980 1727096595.07489: done getting next task for host managed_node2 32980 1727096595.07491: ^ task is: TASK: Set up veth as managed by NetworkManager 32980 1727096595.07494: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.07497: getting variables 32980 1727096595.07498: in VariableManager get_vars() 32980 1727096595.07532: Calling all_inventory to load vars for managed_node2 32980 1727096595.07535: Calling groups_inventory to load vars for managed_node2 32980 1727096595.07537: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.07545: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.07547: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.07550: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.07846: done sending task result for task 0afff68d-5257-457d-ef33-00000000021f 32980 1727096595.07851: WORKER PROCESS EXITING 32980 1727096595.07876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.08312: done with get_vars() 32980 1727096595.08323: done getting variables 32980 1727096595.08423: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:03:15 -0400 (0:00:01.179) 0:00:07.012 ****** 32980 1727096595.08574: entering _queue_task() for managed_node2/command 32980 1727096595.09280: worker is 1 (out of 1 available) 32980 1727096595.09291: exiting _queue_task() for managed_node2/command 32980 1727096595.09305: done queuing things up, now waiting for results queue to drain 32980 1727096595.09307: waiting for pending results... 
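
Editor's note: the task banner above ("Set up veth as managed by NetworkManager", manage_test_interface.yml:35) starts the next command task. Based on the conditional evaluated in the surrounding log and the nmcli arguments that appear in the module result further down, a task of roughly this shape would produce this output; this is a hypothetical sketch, not the file's actual contents.

    # Hypothetical sketch inferred from the logged command and conditionals.
    - name: Set up veth as managed by NetworkManager
      command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'
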
32980 1727096595.09856: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 32980 1727096595.09951: in run() - task 0afff68d-5257-457d-ef33-000000000220 32980 1727096595.09973: variable 'ansible_search_path' from source: unknown 32980 1727096595.09977: variable 'ansible_search_path' from source: unknown 32980 1727096595.10221: calling self._execute() 32980 1727096595.10307: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.10313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.10584: variable 'omit' from source: magic vars 32980 1727096595.11129: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.11143: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.11398: variable 'type' from source: play vars 32980 1727096595.11401: variable 'state' from source: include params 32980 1727096595.11408: Evaluated conditional (type == 'veth' and state == 'present'): True 32980 1727096595.11414: variable 'omit' from source: magic vars 32980 1727096595.11454: variable 'omit' from source: magic vars 32980 1727096595.11550: variable 'interface' from source: play vars 32980 1727096595.11569: variable 'omit' from source: magic vars 32980 1727096595.11913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096595.11947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096595.11966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096595.11987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096595.12000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096595.12025: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096595.12028: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.12033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.12339: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096595.12344: Set connection var ansible_timeout to 10 32980 1727096595.12347: Set connection var ansible_shell_type to sh 32980 1727096595.12349: Set connection var ansible_connection to ssh 32980 1727096595.12356: Set connection var ansible_shell_executable to /bin/sh 32980 1727096595.12361: Set connection var ansible_pipelining to False 32980 1727096595.12388: variable 'ansible_shell_executable' from source: unknown 32980 1727096595.12391: variable 'ansible_connection' from source: unknown 32980 1727096595.12393: variable 'ansible_module_compression' from source: unknown 32980 1727096595.12396: variable 'ansible_shell_type' from source: unknown 32980 1727096595.12398: variable 'ansible_shell_executable' from source: unknown 32980 1727096595.12400: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.12404: variable 'ansible_pipelining' from source: unknown 32980 1727096595.12406: variable 'ansible_timeout' from source: unknown 32980 1727096595.12411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.12745: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096595.12764: variable 'omit' from source: magic vars 32980 1727096595.12766: starting attempt loop 32980 1727096595.12771: running the handler 32980 1727096595.12973: _low_level_execute_command(): starting 32980 1727096595.12976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096595.14184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096595.14193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096595.14294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.14308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.14387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096595.14487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.14539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.16222: stdout chunk (state=3): >>>/root <<< 32980 1727096595.16395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.16613: stderr chunk (state=3): >>><<< 32980 1727096595.16617: stdout chunk (state=3): >>><<< 32980 1727096595.16621: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.16624: _low_level_execute_command(): starting 32980 1727096595.16627: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188 `" && echo ansible-tmp-1727096595.16515-33370-109645183217188="` echo /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188 `" ) && sleep 0' 32980 1727096595.17726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.17742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.17762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.18106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.18110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.20089: stdout chunk (state=3): >>>ansible-tmp-1727096595.16515-33370-109645183217188=/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188 <<< 32980 1727096595.20219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.20255: stderr chunk (state=3): >>><<< 32980 1727096595.20265: stdout chunk (state=3): >>><<< 32980 1727096595.20675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096595.16515-33370-109645183217188=/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.20680: variable 'ansible_module_compression' from source: unknown 32980 1727096595.20683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096595.20685: variable 'ansible_facts' from source: unknown 32980 1727096595.20762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py 32980 1727096595.21191: Sending initial data 32980 1727096595.21201: Sent initial data (154 bytes) 32980 1727096595.22126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.22148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.22185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096595.22316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.22431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096595.22485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.22546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.24178: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096595.24212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096595.24270: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp6090bffn /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py <<< 32980 1727096595.24275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py" <<< 32980 1727096595.24325: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp6090bffn" to remote "/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py" <<< 32980 1727096595.25023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.25109: stderr chunk (state=3): >>><<< 32980 1727096595.25122: stdout chunk (state=3): >>><<< 32980 1727096595.25174: done transferring module to remote 32980 1727096595.25191: _low_level_execute_command(): starting 32980 1727096595.25202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/ /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py && sleep 0' 32980 1727096595.25917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.25931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.25947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096595.26007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.26117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.26285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096595.26309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096595.26362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.26410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.28329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.28333: stdout chunk (state=3): >>><<< 32980 1727096595.28335: stderr chunk (state=3): >>><<< 32980 1727096595.28478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.28482: _low_level_execute_command(): starting 32980 1727096595.28485: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/AnsiballZ_command.py && sleep 0' 32980 1727096595.29677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.29696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.29835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096595.29893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.29929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.47581: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-23 09:03:15.456043", "end": "2024-09-23 09:03:15.474543", "delta": "0:00:00.018500", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096595.49379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096595.49395: stdout chunk (state=3): >>><<< 32980 1727096595.49410: stderr chunk (state=3): >>><<< 32980 1727096595.49438: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-23 09:03:15.456043", "end": "2024-09-23 09:03:15.474543", "delta": "0:00:00.018500", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096595.49779: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096595.49783: _low_level_execute_command(): starting 32980 1727096595.49785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096595.16515-33370-109645183217188/ > /dev/null 2>&1 && sleep 0' 32980 1727096595.50927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.50946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096595.50961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096595.51090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096595.51285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.51356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.53255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.53484: stderr chunk (state=3): >>><<< 32980 1727096595.53487: stdout chunk (state=3): >>><<< 32980 1727096595.53490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.53492: handler run complete 32980 1727096595.53513: Evaluated conditional (False): False 32980 1727096595.53528: attempt loop complete, returning result 32980 1727096595.53677: _execute() done 32980 1727096595.53680: dumping result to json 32980 1727096595.53682: done dumping result, returning 32980 1727096595.53684: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-457d-ef33-000000000220] 32980 1727096595.53686: sending task result for task 0afff68d-5257-457d-ef33-000000000220 32980 1727096595.53758: done sending task result for task 0afff68d-5257-457d-ef33-000000000220 32980 1727096595.53761: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr101", "managed", "true" ], "delta": "0:00:00.018500", "end": "2024-09-23 09:03:15.474543", "rc": 0, "start": "2024-09-23 09:03:15.456043" } 32980 1727096595.53831: no more pending results, returning what we have 32980 1727096595.53834: results queue empty 32980 1727096595.53835: checking for any_errors_fatal 32980 1727096595.53846: done checking for any_errors_fatal 32980 1727096595.53847: checking for max_fail_percentage 32980 1727096595.53849: done checking for max_fail_percentage 32980 1727096595.53850: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.53851: done checking to see if all hosts have failed 32980 1727096595.53852: getting the remaining hosts for this loop 32980 1727096595.53853: done getting the remaining hosts for this loop 32980 1727096595.53857: getting the next task for host managed_node2 32980 1727096595.53865: done getting next task for host managed_node2 32980 1727096595.53870: ^ task is: TASK: Delete veth interface {{ interface }} 32980 1727096595.53874: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.53878: getting variables 32980 1727096595.53880: in VariableManager get_vars() 32980 1727096595.53918: Calling all_inventory to load vars for managed_node2 32980 1727096595.53922: Calling groups_inventory to load vars for managed_node2 32980 1727096595.53924: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.53935: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.53937: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.53940: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.54520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.54909: done with get_vars() 32980 1727096595.54922: done getting variables 32980 1727096595.55071: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096595.55357: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:03:15 -0400 (0:00:00.468) 0:00:07.480 ****** 32980 1727096595.55396: entering _queue_task() for managed_node2/command 32980 1727096595.56115: worker is 1 (out of 1 available) 32980 1727096595.56127: exiting _queue_task() for managed_node2/command 32980 1727096595.56138: done queuing things up, now waiting for results queue to drain 32980 1727096595.56171: waiting for pending results... 
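
Editor's note: the banner above introduces "Delete veth interface lsr101" (manage_test_interface.yml:43), which the log immediately below shows being skipped because its conditional evaluates to False. A hypothetical sketch of a task gated that way follows; only the when clause is taken from the logged false_condition, the ip command itself is assumed.

    # Hypothetical sketch; the command is assumed, the condition comes from the skip output below.
    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
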
32980 1727096595.56785: running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr101 32980 1727096595.56789: in run() - task 0afff68d-5257-457d-ef33-000000000221 32980 1727096595.56792: variable 'ansible_search_path' from source: unknown 32980 1727096595.56795: variable 'ansible_search_path' from source: unknown 32980 1727096595.56797: calling self._execute() 32980 1727096595.56982: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.57175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.57180: variable 'omit' from source: magic vars 32980 1727096595.57723: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.57796: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.58171: variable 'type' from source: play vars 32980 1727096595.58477: variable 'state' from source: include params 32980 1727096595.58480: variable 'interface' from source: play vars 32980 1727096595.58483: variable 'current_interfaces' from source: set_fact 32980 1727096595.58490: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 32980 1727096595.58492: when evaluation is False, skipping this task 32980 1727096595.58494: _execute() done 32980 1727096595.58496: dumping result to json 32980 1727096595.58497: done dumping result, returning 32980 1727096595.58499: done running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr101 [0afff68d-5257-457d-ef33-000000000221] 32980 1727096595.58500: sending task result for task 0afff68d-5257-457d-ef33-000000000221 32980 1727096595.58563: done sending task result for task 0afff68d-5257-457d-ef33-000000000221 32980 1727096595.58566: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096595.58626: no more pending results, returning what we have 32980 1727096595.58629: results queue empty 32980 1727096595.58630: checking for any_errors_fatal 32980 1727096595.58638: done checking for any_errors_fatal 32980 1727096595.58639: checking for max_fail_percentage 32980 1727096595.58641: done checking for max_fail_percentage 32980 1727096595.58641: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.58642: done checking to see if all hosts have failed 32980 1727096595.58643: getting the remaining hosts for this loop 32980 1727096595.58645: done getting the remaining hosts for this loop 32980 1727096595.58650: getting the next task for host managed_node2 32980 1727096595.58658: done getting next task for host managed_node2 32980 1727096595.58661: ^ task is: TASK: Create dummy interface {{ interface }} 32980 1727096595.58665: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.58671: getting variables 32980 1727096595.58672: in VariableManager get_vars() 32980 1727096595.58836: Calling all_inventory to load vars for managed_node2 32980 1727096595.58839: Calling groups_inventory to load vars for managed_node2 32980 1727096595.58840: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.58849: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.58851: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.58854: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.59227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.59719: done with get_vars() 32980 1727096595.59729: done getting variables 32980 1727096595.59795: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096595.60119: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:03:15 -0400 (0:00:00.047) 0:00:07.528 ****** 32980 1727096595.60148: entering _queue_task() for managed_node2/command 32980 1727096595.60902: worker is 1 (out of 1 available) 32980 1727096595.60915: exiting _queue_task() for managed_node2/command 32980 1727096595.60927: done queuing things up, now waiting for results queue to drain 32980 1727096595.60928: waiting for pending results... 
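The skipping result above quotes the task's when expression verbatim as false_condition, which is the only part of the task the log exposes. A hedged sketch of what such a task in manage_test_interface.yml could look like follows; the conditional is copied from the log, but the ip command used to remove the veth pair is an assumption:

- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }} type veth   # assumed command; only the conditional appears in this log
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

The dummy and tap tasks that follow are skipped through the same mechanism, each with its own type/state/current_interfaces test.
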
32980 1727096595.61345: running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr101 32980 1727096595.61533: in run() - task 0afff68d-5257-457d-ef33-000000000222 32980 1727096595.61781: variable 'ansible_search_path' from source: unknown 32980 1727096595.61785: variable 'ansible_search_path' from source: unknown 32980 1727096595.61787: calling self._execute() 32980 1727096595.61877: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.61889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.61952: variable 'omit' from source: magic vars 32980 1727096595.62635: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.62713: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.63238: variable 'type' from source: play vars 32980 1727096595.63242: variable 'state' from source: include params 32980 1727096595.63245: variable 'interface' from source: play vars 32980 1727096595.63247: variable 'current_interfaces' from source: set_fact 32980 1727096595.63250: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 32980 1727096595.63252: when evaluation is False, skipping this task 32980 1727096595.63254: _execute() done 32980 1727096595.63256: dumping result to json 32980 1727096595.63259: done dumping result, returning 32980 1727096595.63261: done running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr101 [0afff68d-5257-457d-ef33-000000000222] 32980 1727096595.63263: sending task result for task 0afff68d-5257-457d-ef33-000000000222 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096595.63420: no more pending results, returning what we have 32980 1727096595.63423: results queue empty 32980 1727096595.63424: checking for any_errors_fatal 32980 1727096595.63431: done checking for any_errors_fatal 32980 1727096595.63432: checking for max_fail_percentage 32980 1727096595.63434: done checking for max_fail_percentage 32980 1727096595.63435: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.63435: done checking to see if all hosts have failed 32980 1727096595.63436: getting the remaining hosts for this loop 32980 1727096595.63438: done getting the remaining hosts for this loop 32980 1727096595.63441: getting the next task for host managed_node2 32980 1727096595.63456: done getting next task for host managed_node2 32980 1727096595.63459: ^ task is: TASK: Delete dummy interface {{ interface }} 32980 1727096595.63463: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.63467: getting variables 32980 1727096595.63470: in VariableManager get_vars() 32980 1727096595.63516: Calling all_inventory to load vars for managed_node2 32980 1727096595.63519: Calling groups_inventory to load vars for managed_node2 32980 1727096595.63522: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.63535: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.63539: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.63542: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.63919: done sending task result for task 0afff68d-5257-457d-ef33-000000000222 32980 1727096595.63922: WORKER PROCESS EXITING 32980 1727096595.63942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.64362: done with get_vars() 32980 1727096595.64438: done getting variables 32980 1727096595.64525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096595.64824: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:03:15 -0400 (0:00:00.047) 0:00:07.575 ****** 32980 1727096595.64853: entering _queue_task() for managed_node2/command 32980 1727096595.65529: worker is 1 (out of 1 available) 32980 1727096595.65543: exiting _queue_task() for managed_node2/command 32980 1727096595.65555: done queuing things up, now waiting for results queue to drain 32980 1727096595.65557: waiting for pending results... 
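These conditionals all test membership in current_interfaces, which the log attributes only to an earlier set_fact ("variable 'current_interfaces' from source: set_fact") without showing how it is built. The sketch below is therefore hypothetical; it uses the standard interfaces fact rather than whatever the test suite actually does:

- name: Record interface names present before the test   # hypothetical task, not taken from this log
  ansible.builtin.set_fact:
    current_interfaces: "{{ ansible_facts['interfaces'] | default([]) }}"
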
32980 1727096595.65989: running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr101 32980 1727096595.66206: in run() - task 0afff68d-5257-457d-ef33-000000000223 32980 1727096595.66225: variable 'ansible_search_path' from source: unknown 32980 1727096595.66233: variable 'ansible_search_path' from source: unknown 32980 1727096595.66491: calling self._execute() 32980 1727096595.66495: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.66498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.66500: variable 'omit' from source: magic vars 32980 1727096595.67240: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.67424: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.67860: variable 'type' from source: play vars 32980 1727096595.67875: variable 'state' from source: include params 32980 1727096595.67885: variable 'interface' from source: play vars 32980 1727096595.67893: variable 'current_interfaces' from source: set_fact 32980 1727096595.67905: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 32980 1727096595.67912: when evaluation is False, skipping this task 32980 1727096595.67919: _execute() done 32980 1727096595.67925: dumping result to json 32980 1727096595.67932: done dumping result, returning 32980 1727096595.67941: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr101 [0afff68d-5257-457d-ef33-000000000223] 32980 1727096595.67955: sending task result for task 0afff68d-5257-457d-ef33-000000000223 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096595.68105: no more pending results, returning what we have 32980 1727096595.68108: results queue empty 32980 1727096595.68109: checking for any_errors_fatal 32980 1727096595.68118: done checking for any_errors_fatal 32980 1727096595.68119: checking for max_fail_percentage 32980 1727096595.68120: done checking for max_fail_percentage 32980 1727096595.68121: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.68122: done checking to see if all hosts have failed 32980 1727096595.68123: getting the remaining hosts for this loop 32980 1727096595.68125: done getting the remaining hosts for this loop 32980 1727096595.68129: getting the next task for host managed_node2 32980 1727096595.68137: done getting next task for host managed_node2 32980 1727096595.68140: ^ task is: TASK: Create tap interface {{ interface }} 32980 1727096595.68143: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.68148: getting variables 32980 1727096595.68149: in VariableManager get_vars() 32980 1727096595.68398: Calling all_inventory to load vars for managed_node2 32980 1727096595.68401: Calling groups_inventory to load vars for managed_node2 32980 1727096595.68404: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.68411: done sending task result for task 0afff68d-5257-457d-ef33-000000000223 32980 1727096595.68415: WORKER PROCESS EXITING 32980 1727096595.68429: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.68431: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.68434: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.68957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.69314: done with get_vars() 32980 1727096595.69324: done getting variables 32980 1727096595.69504: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096595.69732: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:03:15 -0400 (0:00:00.049) 0:00:07.624 ****** 32980 1727096595.69761: entering _queue_task() for managed_node2/command 32980 1727096595.70418: worker is 1 (out of 1 available) 32980 1727096595.70431: exiting _queue_task() for managed_node2/command 32980 1727096595.70441: done queuing things up, now waiting for results queue to drain 32980 1727096595.70442: waiting for pending results... 
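Note the variable sources the log reports for these conditionals: state comes from "include params" while type and interface come from "play vars". A hedged sketch of how tests_vlan_mtu.yml might wire this up is below; only interface=lsr101 is confirmed by the task names in this log, and the hosts pattern, the remaining values, and the file layout are assumptions:

- hosts: all
  vars:
    interface: lsr101
    type: veth            # assumed; consistent with the dummy and tap tasks being skipped in this run
  tasks:
    - name: Manage the test interface
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present    # passed with the include, matching the "include params" source seen in the log
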
32980 1727096595.71006: running TaskExecutor() for managed_node2/TASK: Create tap interface lsr101 32980 1727096595.71359: in run() - task 0afff68d-5257-457d-ef33-000000000224 32980 1727096595.71561: variable 'ansible_search_path' from source: unknown 32980 1727096595.71564: variable 'ansible_search_path' from source: unknown 32980 1727096595.71569: calling self._execute() 32980 1727096595.71780: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.71783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.71785: variable 'omit' from source: magic vars 32980 1727096595.72612: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.72688: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.72982: variable 'type' from source: play vars 32980 1727096595.72992: variable 'state' from source: include params 32980 1727096595.73000: variable 'interface' from source: play vars 32980 1727096595.73008: variable 'current_interfaces' from source: set_fact 32980 1727096595.73020: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 32980 1727096595.73026: when evaluation is False, skipping this task 32980 1727096595.73043: _execute() done 32980 1727096595.73052: dumping result to json 32980 1727096595.73058: done dumping result, returning 32980 1727096595.73069: done running TaskExecutor() for managed_node2/TASK: Create tap interface lsr101 [0afff68d-5257-457d-ef33-000000000224] 32980 1727096595.73088: sending task result for task 0afff68d-5257-457d-ef33-000000000224 32980 1727096595.73284: done sending task result for task 0afff68d-5257-457d-ef33-000000000224 32980 1727096595.73295: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096595.73342: no more pending results, returning what we have 32980 1727096595.73345: results queue empty 32980 1727096595.73346: checking for any_errors_fatal 32980 1727096595.73352: done checking for any_errors_fatal 32980 1727096595.73353: checking for max_fail_percentage 32980 1727096595.73354: done checking for max_fail_percentage 32980 1727096595.73355: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.73356: done checking to see if all hosts have failed 32980 1727096595.73356: getting the remaining hosts for this loop 32980 1727096595.73358: done getting the remaining hosts for this loop 32980 1727096595.73361: getting the next task for host managed_node2 32980 1727096595.73371: done getting next task for host managed_node2 32980 1727096595.73373: ^ task is: TASK: Delete tap interface {{ interface }} 32980 1727096595.73377: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.73380: getting variables 32980 1727096595.73382: in VariableManager get_vars() 32980 1727096595.73422: Calling all_inventory to load vars for managed_node2 32980 1727096595.73425: Calling groups_inventory to load vars for managed_node2 32980 1727096595.73427: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.73437: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.73439: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.73442: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.73666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.73891: done with get_vars() 32980 1727096595.73902: done getting variables 32980 1727096595.73968: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096595.74090: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:03:15 -0400 (0:00:00.043) 0:00:07.667 ****** 32980 1727096595.74119: entering _queue_task() for managed_node2/command 32980 1727096595.74448: worker is 1 (out of 1 available) 32980 1727096595.74459: exiting _queue_task() for managed_node2/command 32980 1727096595.74613: done queuing things up, now waiting for results queue to drain 32980 1727096595.74615: waiting for pending results... 
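Every task in this stretch first passes the guard ansible_distribution_major_version != '6' before its own conditional is considered. The log shows only the expression, not where it is attached; one common arrangement, sketched here with an assumed block wrapper and an assumed ip command, is a block-level when combined with each task's own condition:

- name: Interface management steps skipped on EL6   # hypothetical wrapper block
  when: ansible_distribution_major_version != '6'
  block:
    - name: Create tap interface {{ interface }}
      ansible.builtin.command: ip tuntap add dev {{ interface }} mode tap   # assumed command; only the name and conditional appear in the log
      when: type == 'tap' and state == 'present' and interface not in current_interfaces
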
32980 1727096595.75093: running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr101 32980 1727096595.75270: in run() - task 0afff68d-5257-457d-ef33-000000000225 32980 1727096595.75298: variable 'ansible_search_path' from source: unknown 32980 1727096595.75305: variable 'ansible_search_path' from source: unknown 32980 1727096595.75441: calling self._execute() 32980 1727096595.75682: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.75776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.75781: variable 'omit' from source: magic vars 32980 1727096595.76441: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.76531: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.76977: variable 'type' from source: play vars 32980 1727096595.77023: variable 'state' from source: include params 32980 1727096595.77040: variable 'interface' from source: play vars 32980 1727096595.77049: variable 'current_interfaces' from source: set_fact 32980 1727096595.77063: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 32980 1727096595.77071: when evaluation is False, skipping this task 32980 1727096595.77082: _execute() done 32980 1727096595.77099: dumping result to json 32980 1727096595.77102: done dumping result, returning 32980 1727096595.77151: done running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr101 [0afff68d-5257-457d-ef33-000000000225] 32980 1727096595.77155: sending task result for task 0afff68d-5257-457d-ef33-000000000225 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096595.77415: no more pending results, returning what we have 32980 1727096595.77424: results queue empty 32980 1727096595.77425: checking for any_errors_fatal 32980 1727096595.77433: done checking for any_errors_fatal 32980 1727096595.77434: checking for max_fail_percentage 32980 1727096595.77436: done checking for max_fail_percentage 32980 1727096595.77437: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.77438: done checking to see if all hosts have failed 32980 1727096595.77439: getting the remaining hosts for this loop 32980 1727096595.77440: done getting the remaining hosts for this loop 32980 1727096595.77444: getting the next task for host managed_node2 32980 1727096595.77454: done getting next task for host managed_node2 32980 1727096595.77457: ^ task is: TASK: Include the task 'assert_device_present.yml' 32980 1727096595.77460: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.77464: getting variables 32980 1727096595.77466: in VariableManager get_vars() 32980 1727096595.77616: Calling all_inventory to load vars for managed_node2 32980 1727096595.77619: Calling groups_inventory to load vars for managed_node2 32980 1727096595.77622: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.77634: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.77637: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.77640: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.77957: done sending task result for task 0afff68d-5257-457d-ef33-000000000225 32980 1727096595.77960: WORKER PROCESS EXITING 32980 1727096595.77989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.78199: done with get_vars() 32980 1727096595.78215: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16 Monday 23 September 2024 09:03:15 -0400 (0:00:00.041) 0:00:07.709 ****** 32980 1727096595.78309: entering _queue_task() for managed_node2/include_tasks 32980 1727096595.78607: worker is 1 (out of 1 available) 32980 1727096595.78618: exiting _queue_task() for managed_node2/include_tasks 32980 1727096595.78629: done queuing things up, now waiting for results queue to drain 32980 1727096595.78630: waiting for pending results... 32980 1727096595.78900: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 32980 1727096595.79001: in run() - task 0afff68d-5257-457d-ef33-00000000000d 32980 1727096595.79018: variable 'ansible_search_path' from source: unknown 32980 1727096595.79055: calling self._execute() 32980 1727096595.79153: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.79164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.79181: variable 'omit' from source: magic vars 32980 1727096595.79556: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.79575: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.79586: _execute() done 32980 1727096595.79594: dumping result to json 32980 1727096595.79601: done dumping result, returning 32980 1727096595.79611: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-457d-ef33-00000000000d] 32980 1727096595.79619: sending task result for task 0afff68d-5257-457d-ef33-00000000000d 32980 1727096595.79769: no more pending results, returning what we have 32980 1727096595.79773: in VariableManager get_vars() 32980 1727096595.79819: Calling all_inventory to load vars for managed_node2 32980 1727096595.79822: Calling groups_inventory to load vars for managed_node2 32980 1727096595.79824: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.79837: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.79840: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.79843: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.80224: done sending task result for task 0afff68d-5257-457d-ef33-00000000000d 32980 1727096595.80228: WORKER PROCESS EXITING 32980 1727096595.80251: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.80475: done with get_vars() 32980 1727096595.80483: variable 'ansible_search_path' from source: unknown 32980 1727096595.80496: we have included files to process 32980 1727096595.80497: generating all_blocks data 32980 1727096595.80498: done generating all_blocks data 32980 1727096595.80510: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096595.80512: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096595.80514: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096595.80727: in VariableManager get_vars() 32980 1727096595.80748: done with get_vars() 32980 1727096595.80889: done processing included file 32980 1727096595.80891: iterating over new_blocks loaded from include file 32980 1727096595.80893: in VariableManager get_vars() 32980 1727096595.80909: done with get_vars() 32980 1727096595.80911: filtering new block on tags 32980 1727096595.81061: done filtering new block on tags 32980 1727096595.81064: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 32980 1727096595.81086: extending task lists for all hosts with included blocks 32980 1727096595.84980: done extending task lists 32980 1727096595.84982: done processing included files 32980 1727096595.84983: results queue empty 32980 1727096595.84984: checking for any_errors_fatal 32980 1727096595.84987: done checking for any_errors_fatal 32980 1727096595.84988: checking for max_fail_percentage 32980 1727096595.84989: done checking for max_fail_percentage 32980 1727096595.84990: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.84991: done checking to see if all hosts have failed 32980 1727096595.84991: getting the remaining hosts for this loop 32980 1727096595.84993: done getting the remaining hosts for this loop 32980 1727096595.84995: getting the next task for host managed_node2 32980 1727096595.84999: done getting next task for host managed_node2 32980 1727096595.85002: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32980 1727096595.85005: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.85007: getting variables 32980 1727096595.85008: in VariableManager get_vars() 32980 1727096595.85025: Calling all_inventory to load vars for managed_node2 32980 1727096595.85028: Calling groups_inventory to load vars for managed_node2 32980 1727096595.85030: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.85035: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.85038: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.85041: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.85419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.85623: done with get_vars() 32980 1727096595.85632: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:03:15 -0400 (0:00:00.074) 0:00:07.784 ****** 32980 1727096595.85798: entering _queue_task() for managed_node2/include_tasks 32980 1727096595.86294: worker is 1 (out of 1 available) 32980 1727096595.86305: exiting _queue_task() for managed_node2/include_tasks 32980 1727096595.86316: done queuing things up, now waiting for results queue to drain 32980 1727096595.86317: waiting for pending results... 32980 1727096595.86552: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 32980 1727096595.86678: in run() - task 0afff68d-5257-457d-ef33-00000000038b 32980 1727096595.86703: variable 'ansible_search_path' from source: unknown 32980 1727096595.86713: variable 'ansible_search_path' from source: unknown 32980 1727096595.86750: calling self._execute() 32980 1727096595.86839: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.86848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.86860: variable 'omit' from source: magic vars 32980 1727096595.87287: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.87303: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.87573: _execute() done 32980 1727096595.87577: dumping result to json 32980 1727096595.87581: done dumping result, returning 32980 1727096595.87583: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-457d-ef33-00000000038b] 32980 1727096595.87586: sending task result for task 0afff68d-5257-457d-ef33-00000000038b 32980 1727096595.87655: done sending task result for task 0afff68d-5257-457d-ef33-00000000038b 32980 1727096595.87658: WORKER PROCESS EXITING 32980 1727096595.87690: no more pending results, returning what we have 32980 1727096595.87695: in VariableManager get_vars() 32980 1727096595.87743: Calling all_inventory to load vars for managed_node2 32980 1727096595.87745: Calling groups_inventory to load vars for managed_node2 32980 1727096595.87748: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.87762: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.87765: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.87770: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.88353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 32980 1727096595.88927: done with get_vars() 32980 1727096595.88935: variable 'ansible_search_path' from source: unknown 32980 1727096595.88936: variable 'ansible_search_path' from source: unknown 32980 1727096595.88976: we have included files to process 32980 1727096595.88977: generating all_blocks data 32980 1727096595.88978: done generating all_blocks data 32980 1727096595.88980: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096595.88981: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096595.88983: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096595.89693: done processing included file 32980 1727096595.89695: iterating over new_blocks loaded from include file 32980 1727096595.89697: in VariableManager get_vars() 32980 1727096595.89717: done with get_vars() 32980 1727096595.89719: filtering new block on tags 32980 1727096595.89734: done filtering new block on tags 32980 1727096595.89737: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 32980 1727096595.89742: extending task lists for all hosts with included blocks 32980 1727096595.89961: done extending task lists 32980 1727096595.89963: done processing included files 32980 1727096595.89964: results queue empty 32980 1727096595.89965: checking for any_errors_fatal 32980 1727096595.90051: done checking for any_errors_fatal 32980 1727096595.90052: checking for max_fail_percentage 32980 1727096595.90053: done checking for max_fail_percentage 32980 1727096595.90054: checking to see if all hosts have failed and the running result is not ok 32980 1727096595.90055: done checking to see if all hosts have failed 32980 1727096595.90056: getting the remaining hosts for this loop 32980 1727096595.90057: done getting the remaining hosts for this loop 32980 1727096595.90060: getting the next task for host managed_node2 32980 1727096595.90064: done getting next task for host managed_node2 32980 1727096595.90067: ^ task is: TASK: Get stat for interface {{ interface }} 32980 1727096595.90072: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096595.90074: getting variables 32980 1727096595.90075: in VariableManager get_vars() 32980 1727096595.90090: Calling all_inventory to load vars for managed_node2 32980 1727096595.90092: Calling groups_inventory to load vars for managed_node2 32980 1727096595.90094: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096595.90099: Calling all_plugins_play to load vars for managed_node2 32980 1727096595.90213: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096595.90220: Calling groups_plugins_play to load vars for managed_node2 32980 1727096595.90459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096595.90935: done with get_vars() 32980 1727096595.90944: done getting variables 32980 1727096595.91390: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:03:15 -0400 (0:00:00.057) 0:00:07.842 ****** 32980 1727096595.91547: entering _queue_task() for managed_node2/stat 32980 1727096595.92276: worker is 1 (out of 1 available) 32980 1727096595.92287: exiting _queue_task() for managed_node2/stat 32980 1727096595.92301: done queuing things up, now waiting for results queue to drain 32980 1727096595.92303: waiting for pending results... 32980 1727096595.92725: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr101 32980 1727096595.92859: in run() - task 0afff68d-5257-457d-ef33-0000000004a4 32980 1727096595.92873: variable 'ansible_search_path' from source: unknown 32980 1727096595.92879: variable 'ansible_search_path' from source: unknown 32980 1727096595.92954: calling self._execute() 32980 1727096595.93130: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.93134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.93137: variable 'omit' from source: magic vars 32980 1727096595.93719: variable 'ansible_distribution_major_version' from source: facts 32980 1727096595.93735: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096595.93746: variable 'omit' from source: magic vars 32980 1727096595.93802: variable 'omit' from source: magic vars 32980 1727096595.93917: variable 'interface' from source: play vars 32980 1727096595.93945: variable 'omit' from source: magic vars 32980 1727096595.93992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096595.94073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096595.94077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096595.94084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096595.94098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096595.94141: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096595.94152: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.94160: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 32980 1727096595.94344: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096595.94347: Set connection var ansible_timeout to 10 32980 1727096595.94350: Set connection var ansible_shell_type to sh 32980 1727096595.94352: Set connection var ansible_connection to ssh 32980 1727096595.94354: Set connection var ansible_shell_executable to /bin/sh 32980 1727096595.94356: Set connection var ansible_pipelining to False 32980 1727096595.94359: variable 'ansible_shell_executable' from source: unknown 32980 1727096595.94361: variable 'ansible_connection' from source: unknown 32980 1727096595.94363: variable 'ansible_module_compression' from source: unknown 32980 1727096595.94365: variable 'ansible_shell_type' from source: unknown 32980 1727096595.94366: variable 'ansible_shell_executable' from source: unknown 32980 1727096595.94370: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096595.94372: variable 'ansible_pipelining' from source: unknown 32980 1727096595.94373: variable 'ansible_timeout' from source: unknown 32980 1727096595.94375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096595.94599: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096595.94630: variable 'omit' from source: magic vars 32980 1727096595.94634: starting attempt loop 32980 1727096595.94636: running the handler 32980 1727096595.94677: _low_level_execute_command(): starting 32980 1727096595.94680: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096595.95866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.95885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.95972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096595.95996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096595.96138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096595.97793: stdout chunk (state=3): >>>/root <<< 32980 1727096595.97985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096595.98023: stdout chunk (state=3): >>><<< 32980 1727096595.98029: stderr chunk (state=3): >>><<< 32980 1727096595.98191: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096595.98195: _low_level_execute_command(): starting 32980 1727096595.98197: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661 `" && echo ansible-tmp-1727096595.9805832-33399-24383856559661="` echo /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661 `" ) && sleep 0' 32980 1727096595.99514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096595.99696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096595.99902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.00088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.00270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.02109: stdout chunk (state=3): >>>ansible-tmp-1727096595.9805832-33399-24383856559661=/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661 <<< 32980 1727096596.02282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.02311: stderr chunk (state=3): >>><<< 32980 1727096596.02328: stdout chunk (state=3): >>><<< 32980 1727096596.02364: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096595.9805832-33399-24383856559661=/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.02415: variable 'ansible_module_compression' from source: unknown 32980 1727096596.02536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32980 1727096596.02674: variable 'ansible_facts' from source: unknown 32980 1727096596.02747: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py 32980 1727096596.03096: Sending initial data 32980 1727096596.03109: Sent initial data (152 bytes) 32980 1727096596.03878: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.04199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.04238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.05915: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096596.05920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096596.05957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpleisfhdc /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py <<< 32980 1727096596.05966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py" <<< 32980 1727096596.06012: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpleisfhdc" to remote "/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py" <<< 32980 1727096596.06015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py" <<< 32980 1727096596.07453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.07531: stderr chunk (state=3): >>><<< 32980 1727096596.07534: stdout chunk (state=3): >>><<< 32980 1727096596.07547: done transferring module to remote 32980 1727096596.07556: _low_level_execute_command(): starting 32980 1727096596.07564: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/ /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py && sleep 0' 32980 1727096596.08881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096596.08885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096596.08887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096596.08890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096596.08893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096596.08895: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096596.08897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.08899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096596.08901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096596.08902: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096596.08982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.08990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.09056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.11257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.11262: stdout chunk (state=3): >>><<< 32980 1727096596.11266: stderr chunk (state=3): >>><<< 32980 1727096596.11298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.11301: _low_level_execute_command(): starting 32980 1727096596.11306: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/AnsiballZ_stat.py && sleep 0' 32980 1727096596.12287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096596.12295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096596.12475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096596.12479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096596.12482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096596.12485: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096596.12487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.12490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096596.12492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096596.12495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096596.12497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096596.12500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.12527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096596.12600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.12734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.28453: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31628, "dev": 23, "nlink": 1, "atime": 1727096594.2218866, "mtime": 1727096594.2218866, "ctime": 1727096594.2218866, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32980 1727096596.29815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096596.29847: stderr chunk (state=3): >>><<< 32980 1727096596.29851: stdout chunk (state=3): >>><<< 32980 1727096596.29871: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31628, "dev": 23, "nlink": 1, "atime": 1727096594.2218866, "mtime": 1727096594.2218866, "ctime": 1727096594.2218866, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096596.29910: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096596.29918: _low_level_execute_command(): starting 32980 1727096596.29923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096595.9805832-33399-24383856559661/ > /dev/null 2>&1 && sleep 0' 32980 1727096596.30617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.30621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096596.30624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.30665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.30713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.32560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.32604: stderr chunk (state=3): >>><<< 32980 1727096596.32607: stdout chunk (state=3): >>><<< 32980 1727096596.32626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.32629: handler run complete 32980 1727096596.32667: attempt loop complete, returning result 32980 1727096596.32671: _execute() done 32980 1727096596.32674: dumping result to json 32980 1727096596.32680: done dumping result, returning 32980 1727096596.32688: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr101 [0afff68d-5257-457d-ef33-0000000004a4] 32980 1727096596.32691: sending task result for task 0afff68d-5257-457d-ef33-0000000004a4 32980 1727096596.32797: done sending task result for task 0afff68d-5257-457d-ef33-0000000004a4 32980 1727096596.32800: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096594.2218866, "block_size": 4096, "blocks": 0, "ctime": 1727096594.2218866, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31628, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1727096594.2218866, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 32980 1727096596.32891: no more pending results, returning what we have 32980 1727096596.32894: results queue empty 32980 1727096596.32895: checking for any_errors_fatal 32980 1727096596.32897: done checking for any_errors_fatal 32980 1727096596.32897: checking for max_fail_percentage 32980 1727096596.32899: done checking for max_fail_percentage 32980 1727096596.32900: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.32901: done checking to see if all hosts have failed 32980 1727096596.32901: getting the remaining hosts for this loop 32980 1727096596.32903: done getting the remaining hosts for this loop 32980 1727096596.32906: getting the next task for host managed_node2 32980 1727096596.32916: done getting next task for host managed_node2 32980 1727096596.32919: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 32980 1727096596.32921: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.32925: getting variables 32980 1727096596.32926: in VariableManager get_vars() 32980 1727096596.33029: Calling all_inventory to load vars for managed_node2 32980 1727096596.33032: Calling groups_inventory to load vars for managed_node2 32980 1727096596.33034: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.33043: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.33045: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.33047: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.33153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.33275: done with get_vars() 32980 1727096596.33283: done getting variables 32980 1727096596.33356: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 32980 1727096596.33447: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:03:16 -0400 (0:00:00.419) 0:00:08.261 ****** 32980 1727096596.33471: entering _queue_task() for managed_node2/assert 32980 1727096596.33474: Creating lock for assert 32980 1727096596.33696: worker is 1 (out of 1 available) 32980 1727096596.33709: exiting _queue_task() for managed_node2/assert 32980 1727096596.33720: done queuing things up, now waiting for results queue to drain 32980 1727096596.33721: waiting for pending results... 
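The pair of tasks around this point — the stat of /sys/class/net/lsr101 above and the assertion on interface_stat.stat.exists that runs next — implement the device-presence check from tasks/assert_device_present.yml. A minimal sketch of what those two tasks likely look like, reconstructed only from the module arguments and conditional visible in this log (the exact file layout, the interface_stat registration, and the failure message are assumptions, not copied from the playbook source):

    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
        follow: false
      register: interface_stat   # assumed; the log shows the variable but not how it was stored

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists
        msg: Interface '{{ interface }}' is not present on the managed host   # assumed message

Because /sys/class/net/<device> is a symlink into /sys/devices, the stat result above reports islnk: true with lnk_source /sys/devices/virtual/net/lsr101; exists: true is therefore enough to prove the kernel created the lsr101 device.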
32980 1727096596.33894: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr101' 32980 1727096596.34035: in run() - task 0afff68d-5257-457d-ef33-00000000038c 32980 1727096596.34040: variable 'ansible_search_path' from source: unknown 32980 1727096596.34042: variable 'ansible_search_path' from source: unknown 32980 1727096596.34045: calling self._execute() 32980 1727096596.34183: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.34187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.34190: variable 'omit' from source: magic vars 32980 1727096596.34482: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.34494: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.34499: variable 'omit' from source: magic vars 32980 1727096596.34533: variable 'omit' from source: magic vars 32980 1727096596.34629: variable 'interface' from source: play vars 32980 1727096596.34647: variable 'omit' from source: magic vars 32980 1727096596.34692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096596.34732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096596.34744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096596.34763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.34779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.34810: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096596.34814: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.34817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.34948: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096596.34956: Set connection var ansible_timeout to 10 32980 1727096596.34959: Set connection var ansible_shell_type to sh 32980 1727096596.34961: Set connection var ansible_connection to ssh 32980 1727096596.34964: Set connection var ansible_shell_executable to /bin/sh 32980 1727096596.34966: Set connection var ansible_pipelining to False 32980 1727096596.34971: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.34974: variable 'ansible_connection' from source: unknown 32980 1727096596.34976: variable 'ansible_module_compression' from source: unknown 32980 1727096596.34978: variable 'ansible_shell_type' from source: unknown 32980 1727096596.34981: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.34984: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.34986: variable 'ansible_pipelining' from source: unknown 32980 1727096596.34989: variable 'ansible_timeout' from source: unknown 32980 1727096596.34992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.35174: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32980 1727096596.35178: variable 'omit' from source: magic vars 32980 1727096596.35181: starting attempt loop 32980 1727096596.35183: running the handler 32980 1727096596.35373: variable 'interface_stat' from source: set_fact 32980 1727096596.35378: Evaluated conditional (interface_stat.stat.exists): True 32980 1727096596.35381: handler run complete 32980 1727096596.35383: attempt loop complete, returning result 32980 1727096596.35386: _execute() done 32980 1727096596.35388: dumping result to json 32980 1727096596.35390: done dumping result, returning 32980 1727096596.35392: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr101' [0afff68d-5257-457d-ef33-00000000038c] 32980 1727096596.35394: sending task result for task 0afff68d-5257-457d-ef33-00000000038c 32980 1727096596.35451: done sending task result for task 0afff68d-5257-457d-ef33-00000000038c 32980 1727096596.35454: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096596.35505: no more pending results, returning what we have 32980 1727096596.35507: results queue empty 32980 1727096596.35508: checking for any_errors_fatal 32980 1727096596.35516: done checking for any_errors_fatal 32980 1727096596.35516: checking for max_fail_percentage 32980 1727096596.35518: done checking for max_fail_percentage 32980 1727096596.35519: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.35519: done checking to see if all hosts have failed 32980 1727096596.35520: getting the remaining hosts for this loop 32980 1727096596.35521: done getting the remaining hosts for this loop 32980 1727096596.35525: getting the next task for host managed_node2 32980 1727096596.35532: done getting next task for host managed_node2 32980 1727096596.35534: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 32980 1727096596.35536: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.35540: getting variables 32980 1727096596.35542: in VariableManager get_vars() 32980 1727096596.35582: Calling all_inventory to load vars for managed_node2 32980 1727096596.35585: Calling groups_inventory to load vars for managed_node2 32980 1727096596.35587: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.35596: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.35599: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.35601: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.35789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.36041: done with get_vars() 32980 1727096596.36053: done getting variables 32980 1727096596.36124: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] 
*** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18 Monday 23 September 2024 09:03:16 -0400 (0:00:00.026) 0:00:08.288 ****** 32980 1727096596.36151: entering _queue_task() for managed_node2/debug 32980 1727096596.36695: worker is 1 (out of 1 available) 32980 1727096596.36702: exiting _queue_task() for managed_node2/debug 32980 1727096596.36710: done queuing things up, now waiting for results queue to drain 32980 1727096596.36711: waiting for pending results... 32980 1727096596.36953: running TaskExecutor() for managed_node2/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 32980 1727096596.36958: in run() - task 0afff68d-5257-457d-ef33-00000000000e 32980 1727096596.36962: variable 'ansible_search_path' from source: unknown 32980 1727096596.36964: calling self._execute() 32980 1727096596.37009: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.37021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.37036: variable 'omit' from source: magic vars 32980 1727096596.37451: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.37474: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.37496: variable 'omit' from source: magic vars 32980 1727096596.37524: variable 'omit' from source: magic vars 32980 1727096596.37563: variable 'omit' from source: magic vars 32980 1727096596.37624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096596.37705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096596.37708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096596.37712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.37730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.37763: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096596.37773: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.37781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.37897: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096596.37909: Set connection var ansible_timeout to 10 32980 1727096596.37954: Set connection var ansible_shell_type to sh 32980 1727096596.37957: Set connection var ansible_connection to ssh 32980 1727096596.37960: Set connection var ansible_shell_executable to /bin/sh 32980 1727096596.37961: Set connection var ansible_pipelining to False 32980 1727096596.37986: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.37995: variable 'ansible_connection' from source: unknown 32980 1727096596.38033: variable 'ansible_module_compression' from source: unknown 32980 1727096596.38036: variable 'ansible_shell_type' from source: unknown 32980 1727096596.38039: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.38041: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.38042: variable 'ansible_pipelining' from source: unknown 32980 1727096596.38044: variable 'ansible_timeout' from source: 
unknown 32980 1727096596.38046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.38252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096596.38255: variable 'omit' from source: magic vars 32980 1727096596.38258: starting attempt loop 32980 1727096596.38260: running the handler 32980 1727096596.38296: handler run complete 32980 1727096596.38320: attempt loop complete, returning result 32980 1727096596.38328: _execute() done 32980 1727096596.38334: dumping result to json 32980 1727096596.38342: done dumping result, returning 32980 1727096596.38363: done running TaskExecutor() for managed_node2/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. [0afff68d-5257-457d-ef33-00000000000e] 32980 1727096596.38389: sending task result for task 0afff68d-5257-457d-ef33-00000000000e 32980 1727096596.38600: done sending task result for task 0afff68d-5257-457d-ef33-00000000000e 32980 1727096596.38604: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 32980 1727096596.38653: no more pending results, returning what we have 32980 1727096596.38656: results queue empty 32980 1727096596.38657: checking for any_errors_fatal 32980 1727096596.38663: done checking for any_errors_fatal 32980 1727096596.38664: checking for max_fail_percentage 32980 1727096596.38665: done checking for max_fail_percentage 32980 1727096596.38666: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.38669: done checking to see if all hosts have failed 32980 1727096596.38669: getting the remaining hosts for this loop 32980 1727096596.38671: done getting the remaining hosts for this loop 32980 1727096596.38675: getting the next task for host managed_node2 32980 1727096596.38779: done getting next task for host managed_node2 32980 1727096596.38795: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32980 1727096596.38799: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096596.38814: getting variables 32980 1727096596.38816: in VariableManager get_vars() 32980 1727096596.38857: Calling all_inventory to load vars for managed_node2 32980 1727096596.38860: Calling groups_inventory to load vars for managed_node2 32980 1727096596.38862: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.38983: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.38986: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.38990: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.39186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.39416: done with get_vars() 32980 1727096596.39427: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:03:16 -0400 (0:00:00.033) 0:00:08.321 ****** 32980 1727096596.39531: entering _queue_task() for managed_node2/include_tasks 32980 1727096596.39849: worker is 1 (out of 1 available) 32980 1727096596.39861: exiting _queue_task() for managed_node2/include_tasks 32980 1727096596.39876: done queuing things up, now waiting for results queue to drain 32980 1727096596.39995: waiting for pending results... 32980 1727096596.40174: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32980 1727096596.40435: in run() - task 0afff68d-5257-457d-ef33-000000000016 32980 1727096596.40438: variable 'ansible_search_path' from source: unknown 32980 1727096596.40441: variable 'ansible_search_path' from source: unknown 32980 1727096596.40443: calling self._execute() 32980 1727096596.40483: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.40494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.40508: variable 'omit' from source: magic vars 32980 1727096596.40894: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.40912: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.40926: _execute() done 32980 1727096596.40933: dumping result to json 32980 1727096596.40940: done dumping result, returning 32980 1727096596.40951: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-457d-ef33-000000000016] 32980 1727096596.40979: sending task result for task 0afff68d-5257-457d-ef33-000000000016 32980 1727096596.41215: no more pending results, returning what we have 32980 1727096596.41220: in VariableManager get_vars() 32980 1727096596.41270: Calling all_inventory to load vars for managed_node2 32980 1727096596.41274: Calling groups_inventory to load vars for managed_node2 32980 1727096596.41277: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.41290: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.41293: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.41296: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.41741: done sending task result for task 0afff68d-5257-457d-ef33-000000000016 32980 1727096596.41744: WORKER PROCESS EXITING 32980 1727096596.41770: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.41985: done with get_vars() 32980 1727096596.41994: variable 'ansible_search_path' from source: unknown 32980 1727096596.41995: variable 'ansible_search_path' from source: unknown 32980 1727096596.42046: we have included files to process 32980 1727096596.42047: generating all_blocks data 32980 1727096596.42049: done generating all_blocks data 32980 1727096596.42052: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096596.42054: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096596.42056: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096596.42804: done processing included file 32980 1727096596.42807: iterating over new_blocks loaded from include file 32980 1727096596.42808: in VariableManager get_vars() 32980 1727096596.42834: done with get_vars() 32980 1727096596.42836: filtering new block on tags 32980 1727096596.42853: done filtering new block on tags 32980 1727096596.42856: in VariableManager get_vars() 32980 1727096596.42880: done with get_vars() 32980 1727096596.42882: filtering new block on tags 32980 1727096596.42915: done filtering new block on tags 32980 1727096596.42918: in VariableManager get_vars() 32980 1727096596.42942: done with get_vars() 32980 1727096596.42943: filtering new block on tags 32980 1727096596.42961: done filtering new block on tags 32980 1727096596.42964: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 32980 1727096596.42971: extending task lists for all hosts with included blocks 32980 1727096596.43856: done extending task lists 32980 1727096596.43858: done processing included files 32980 1727096596.43859: results queue empty 32980 1727096596.43859: checking for any_errors_fatal 32980 1727096596.43862: done checking for any_errors_fatal 32980 1727096596.43863: checking for max_fail_percentage 32980 1727096596.43864: done checking for max_fail_percentage 32980 1727096596.43865: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.43866: done checking to see if all hosts have failed 32980 1727096596.43867: getting the remaining hosts for this loop 32980 1727096596.43870: done getting the remaining hosts for this loop 32980 1727096596.43873: getting the next task for host managed_node2 32980 1727096596.43885: done getting next task for host managed_node2 32980 1727096596.43889: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32980 1727096596.43892: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.43902: getting variables 32980 1727096596.43903: in VariableManager get_vars() 32980 1727096596.43919: Calling all_inventory to load vars for managed_node2 32980 1727096596.43923: Calling groups_inventory to load vars for managed_node2 32980 1727096596.43925: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.43930: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.43932: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.43935: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.44123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.44340: done with get_vars() 32980 1727096596.44350: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:03:16 -0400 (0:00:00.048) 0:00:08.371 ****** 32980 1727096596.44437: entering _queue_task() for managed_node2/setup 32980 1727096596.45125: worker is 1 (out of 1 available) 32980 1727096596.45137: exiting _queue_task() for managed_node2/setup 32980 1727096596.45148: done queuing things up, now waiting for results queue to drain 32980 1727096596.45149: waiting for pending results... 32980 1727096596.45475: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32980 1727096596.45826: in run() - task 0afff68d-5257-457d-ef33-0000000004bf 32980 1727096596.45993: variable 'ansible_search_path' from source: unknown 32980 1727096596.45997: variable 'ansible_search_path' from source: unknown 32980 1727096596.46001: calling self._execute() 32980 1727096596.46207: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.46325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.46330: variable 'omit' from source: magic vars 32980 1727096596.47274: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.47278: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.47703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096596.50693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096596.50774: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096596.50857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096596.50909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096596.50943: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096596.51057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
32980 1727096596.51097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096596.51175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096596.51197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096596.51227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096596.51294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096596.51350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096596.51370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096596.51416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096596.51468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096596.51630: variable '__network_required_facts' from source: role '' defaults 32980 1727096596.51663: variable 'ansible_facts' from source: unknown 32980 1727096596.51754: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32980 1727096596.51773: when evaluation is False, skipping this task 32980 1727096596.51874: _execute() done 32980 1727096596.51879: dumping result to json 32980 1727096596.51882: done dumping result, returning 32980 1727096596.51885: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-457d-ef33-0000000004bf] 32980 1727096596.51887: sending task result for task 0afff68d-5257-457d-ef33-0000000004bf 32980 1727096596.51965: done sending task result for task 0afff68d-5257-457d-ef33-0000000004bf 32980 1727096596.51970: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096596.52017: no more pending results, returning what we have 32980 1727096596.52020: results queue empty 32980 1727096596.52021: checking for any_errors_fatal 32980 1727096596.52023: done checking for any_errors_fatal 32980 1727096596.52023: checking for max_fail_percentage 32980 1727096596.52025: done checking for max_fail_percentage 32980 1727096596.52026: checking to see if all hosts have failed and the running 
result is not ok 32980 1727096596.52027: done checking to see if all hosts have failed 32980 1727096596.52027: getting the remaining hosts for this loop 32980 1727096596.52029: done getting the remaining hosts for this loop 32980 1727096596.52033: getting the next task for host managed_node2 32980 1727096596.52044: done getting next task for host managed_node2 32980 1727096596.52048: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32980 1727096596.52052: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.52064: getting variables 32980 1727096596.52066: in VariableManager get_vars() 32980 1727096596.52108: Calling all_inventory to load vars for managed_node2 32980 1727096596.52111: Calling groups_inventory to load vars for managed_node2 32980 1727096596.52113: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.52124: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.52126: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.52128: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.52442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.52606: done with get_vars() 32980 1727096596.52620: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:03:16 -0400 (0:00:00.082) 0:00:08.453 ****** 32980 1727096596.52706: entering _queue_task() for managed_node2/stat 32980 1727096596.52918: worker is 1 (out of 1 available) 32980 1727096596.52935: exiting _queue_task() for managed_node2/stat 32980 1727096596.52946: done queuing things up, now waiting for results queue to drain 32980 1727096596.52947: waiting for pending results... 
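The "Ensure ansible_facts used by role are present" task that was just skipped gathers facts only when something the role needs is missing: the logged conditional compares __network_required_facts (from the role defaults) against the keys already present in ansible_facts and runs setup only if the difference is non-empty. Its skipped result is censored because the task sets no_log: true. A hedged sketch of that guard, assuming a plain ansible.builtin.setup call (the real task in set_facts.yml may restrict gathering differently; that detail is not visible in this log):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumption; the role may request a different subset
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because the play gathered facts earlier, every required key is already in ansible_facts here, the difference is empty, and the when clause evaluates False.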
32980 1727096596.53115: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 32980 1727096596.53214: in run() - task 0afff68d-5257-457d-ef33-0000000004c1 32980 1727096596.53225: variable 'ansible_search_path' from source: unknown 32980 1727096596.53229: variable 'ansible_search_path' from source: unknown 32980 1727096596.53255: calling self._execute() 32980 1727096596.53320: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.53325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.53333: variable 'omit' from source: magic vars 32980 1727096596.53605: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.53620: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.53730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096596.53929: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096596.53965: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096596.53995: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096596.54020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096596.54088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096596.54106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096596.54123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096596.54140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096596.54211: variable '__network_is_ostree' from source: set_fact 32980 1727096596.54217: Evaluated conditional (not __network_is_ostree is defined): False 32980 1727096596.54220: when evaluation is False, skipping this task 32980 1727096596.54222: _execute() done 32980 1727096596.54225: dumping result to json 32980 1727096596.54228: done dumping result, returning 32980 1727096596.54236: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-457d-ef33-0000000004c1] 32980 1727096596.54240: sending task result for task 0afff68d-5257-457d-ef33-0000000004c1 32980 1727096596.54321: done sending task result for task 0afff68d-5257-457d-ef33-0000000004c1 32980 1727096596.54324: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32980 1727096596.54389: no more pending results, returning what we have 32980 1727096596.54395: results queue empty 32980 1727096596.54396: checking for any_errors_fatal 32980 1727096596.54406: done checking for any_errors_fatal 32980 1727096596.54407: checking for 
max_fail_percentage 32980 1727096596.54411: done checking for max_fail_percentage 32980 1727096596.54412: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.54413: done checking to see if all hosts have failed 32980 1727096596.54414: getting the remaining hosts for this loop 32980 1727096596.54416: done getting the remaining hosts for this loop 32980 1727096596.54420: getting the next task for host managed_node2 32980 1727096596.54431: done getting next task for host managed_node2 32980 1727096596.54435: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32980 1727096596.54440: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.54458: getting variables 32980 1727096596.54463: in VariableManager get_vars() 32980 1727096596.54523: Calling all_inventory to load vars for managed_node2 32980 1727096596.54529: Calling groups_inventory to load vars for managed_node2 32980 1727096596.54533: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.54549: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.54552: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.54560: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.54765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.54905: done with get_vars() 32980 1727096596.54914: done getting variables 32980 1727096596.54955: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:03:16 -0400 (0:00:00.022) 0:00:08.476 ****** 32980 1727096596.54984: entering _queue_task() for managed_node2/set_fact 32980 1727096596.55208: worker is 1 (out of 1 available) 32980 1727096596.55220: exiting _queue_task() for managed_node2/set_fact 32980 1727096596.55231: done queuing things up, now waiting for results queue to drain 32980 1727096596.55233: waiting for pending results... 
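"Check if system is ostree" was skipped because __network_is_ostree is already defined from an earlier run of the role in this play, so its guard (not __network_is_ostree is defined) evaluates False; the set_fact task queued above is about to be skipped for the same reason. A sketch of the detect-once pattern the two tasks implement — the probe path and the intermediate register name below are illustrative assumptions, not taken from this log:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed probe path
      register: __ostree_booted_stat    # assumed register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Caching the answer in a fact keeps repeated invocations of the role from re-probing the managed host.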
32980 1727096596.55403: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32980 1727096596.55500: in run() - task 0afff68d-5257-457d-ef33-0000000004c2 32980 1727096596.55511: variable 'ansible_search_path' from source: unknown 32980 1727096596.55516: variable 'ansible_search_path' from source: unknown 32980 1727096596.55542: calling self._execute() 32980 1727096596.55609: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.55614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.55622: variable 'omit' from source: magic vars 32980 1727096596.55895: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.55906: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.56042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096596.56303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096596.56377: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096596.56381: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096596.56722: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096596.56725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096596.56727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096596.56730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096596.56731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096596.56733: variable '__network_is_ostree' from source: set_fact 32980 1727096596.56735: Evaluated conditional (not __network_is_ostree is defined): False 32980 1727096596.56736: when evaluation is False, skipping this task 32980 1727096596.56738: _execute() done 32980 1727096596.56740: dumping result to json 32980 1727096596.56741: done dumping result, returning 32980 1727096596.56743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-457d-ef33-0000000004c2] 32980 1727096596.56745: sending task result for task 0afff68d-5257-457d-ef33-0000000004c2 32980 1727096596.56808: done sending task result for task 0afff68d-5257-457d-ef33-0000000004c2 32980 1727096596.56811: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32980 1727096596.56855: no more pending results, returning what we have 32980 1727096596.56857: results queue empty 32980 1727096596.56858: checking for any_errors_fatal 32980 1727096596.56865: done checking for any_errors_fatal 32980 
1727096596.56866: checking for max_fail_percentage 32980 1727096596.56869: done checking for max_fail_percentage 32980 1727096596.56870: checking to see if all hosts have failed and the running result is not ok 32980 1727096596.56871: done checking to see if all hosts have failed 32980 1727096596.56872: getting the remaining hosts for this loop 32980 1727096596.56876: done getting the remaining hosts for this loop 32980 1727096596.56879: getting the next task for host managed_node2 32980 1727096596.56888: done getting next task for host managed_node2 32980 1727096596.56892: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32980 1727096596.56895: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096596.56908: getting variables 32980 1727096596.56909: in VariableManager get_vars() 32980 1727096596.56944: Calling all_inventory to load vars for managed_node2 32980 1727096596.56946: Calling groups_inventory to load vars for managed_node2 32980 1727096596.56948: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096596.56956: Calling all_plugins_play to load vars for managed_node2 32980 1727096596.56958: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096596.56960: Calling groups_plugins_play to load vars for managed_node2 32980 1727096596.57487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096596.57716: done with get_vars() 32980 1727096596.57726: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:03:16 -0400 (0:00:00.028) 0:00:08.504 ****** 32980 1727096596.57830: entering _queue_task() for managed_node2/service_facts 32980 1727096596.57832: Creating lock for service_facts 32980 1727096596.58142: worker is 1 (out of 1 available) 32980 1727096596.58155: exiting _queue_task() for managed_node2/service_facts 32980 1727096596.58292: done queuing things up, now waiting for results queue to drain 32980 1727096596.58294: waiting for pending results... 
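"Check which services are running" uses the service_facts module, which takes no parameters and populates ansible_facts.services with the state of every unit known to the service manager; the network role can consult that data when choosing its provider. A minimal sketch of the task plus an illustrative way to read the result afterwards (the debug task is an example added here, not part of set_facts.yml):

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Show NetworkManager state (illustrative only)
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services.get('NetworkManager.service', 'not installed') }}"

The remote activity that follows — echo ~, creating the remote ansible-tmp directory, and the upcoming AnsiballZ transfer — is the standard non-pipelined module execution path, consistent with ansible_pipelining being set to False in the connection vars above.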
32980 1727096596.58588: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 32980 1727096596.58730: in run() - task 0afff68d-5257-457d-ef33-0000000004c4 32980 1727096596.58734: variable 'ansible_search_path' from source: unknown 32980 1727096596.58736: variable 'ansible_search_path' from source: unknown 32980 1727096596.58777: calling self._execute() 32980 1727096596.58898: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.58902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.58905: variable 'omit' from source: magic vars 32980 1727096596.59321: variable 'ansible_distribution_major_version' from source: facts 32980 1727096596.59339: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096596.59389: variable 'omit' from source: magic vars 32980 1727096596.59436: variable 'omit' from source: magic vars 32980 1727096596.59482: variable 'omit' from source: magic vars 32980 1727096596.59540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096596.59609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096596.59623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096596.59645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.59715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096596.59718: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096596.59721: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.59724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.59838: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096596.59848: Set connection var ansible_timeout to 10 32980 1727096596.59856: Set connection var ansible_shell_type to sh 32980 1727096596.59862: Set connection var ansible_connection to ssh 32980 1727096596.59879: Set connection var ansible_shell_executable to /bin/sh 32980 1727096596.59889: Set connection var ansible_pipelining to False 32980 1727096596.59913: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.59976: variable 'ansible_connection' from source: unknown 32980 1727096596.59980: variable 'ansible_module_compression' from source: unknown 32980 1727096596.59982: variable 'ansible_shell_type' from source: unknown 32980 1727096596.59984: variable 'ansible_shell_executable' from source: unknown 32980 1727096596.59986: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096596.59988: variable 'ansible_pipelining' from source: unknown 32980 1727096596.59990: variable 'ansible_timeout' from source: unknown 32980 1727096596.59992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096596.60205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096596.60222: variable 'omit' from source: magic vars 32980 
1727096596.60230: starting attempt loop 32980 1727096596.60262: running the handler 32980 1727096596.60270: _low_level_execute_command(): starting 32980 1727096596.60287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096596.61328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.61370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.61377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.63062: stdout chunk (state=3): >>>/root <<< 32980 1727096596.63216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.63219: stdout chunk (state=3): >>><<< 32980 1727096596.63222: stderr chunk (state=3): >>><<< 32980 1727096596.63344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.63347: _low_level_execute_command(): starting 32980 1727096596.63350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116 `" && echo ansible-tmp-1727096596.632493-33434-123300487904116="` echo /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116 `" ) && sleep 0' 32980 1727096596.63935: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096596.63950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096596.63963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096596.63984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096596.64001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096596.64089: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.64142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096596.64164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.64187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.64320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.66272: stdout chunk (state=3): >>>ansible-tmp-1727096596.632493-33434-123300487904116=/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116 <<< 32980 1727096596.66446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.66449: stdout chunk (state=3): >>><<< 32980 1727096596.66452: stderr chunk (state=3): >>><<< 32980 1727096596.66469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096596.632493-33434-123300487904116=/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.66587: variable 'ansible_module_compression' from source: unknown 32980 1727096596.66590: ANSIBALLZ: Using lock for service_facts 32980 
1727096596.66595: ANSIBALLZ: Acquiring lock 32980 1727096596.66602: ANSIBALLZ: Lock acquired: 140258571574336 32980 1727096596.66609: ANSIBALLZ: Creating module 32980 1727096596.90454: ANSIBALLZ: Writing module into payload 32980 1727096596.90849: ANSIBALLZ: Writing module 32980 1727096596.90957: ANSIBALLZ: Renaming module 32980 1727096596.90961: ANSIBALLZ: Done creating module 32980 1727096596.90964: variable 'ansible_facts' from source: unknown 32980 1727096596.91176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py 32980 1727096596.91495: Sending initial data 32980 1727096596.91498: Sent initial data (161 bytes) 32980 1727096596.92881: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096596.92885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096596.92887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.92992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.93104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.94770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096596.94819: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096596.94827: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpaumpxo_w /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py <<< 32980 1727096596.94878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py" <<< 32980 1727096596.94882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpaumpxo_w" to remote "/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py" <<< 32980 1727096596.96328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.96483: stderr chunk (state=3): >>><<< 32980 1727096596.96486: stdout chunk (state=3): >>><<< 32980 1727096596.96507: done transferring module to remote 32980 1727096596.96527: _low_level_execute_command(): starting 32980 1727096596.96530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/ /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py && sleep 0' 32980 1727096596.97982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096596.97986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096596.97988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096596.99719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096596.99723: stdout chunk (state=3): >>><<< 32980 1727096596.99730: stderr chunk (state=3): >>><<< 32980 1727096596.99777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096596.99781: _low_level_execute_command(): starting 32980 1727096596.99783: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/AnsiballZ_service_facts.py && sleep 0' 32980 1727096597.00986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096597.01192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096597.01205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096597.01223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096597.01310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096598.58598: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 32980 1727096598.58623: stdout chunk (state=3): >>>rk.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", 
"status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 32980 1727096598.58654: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": 
{"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 32980 1727096598.58685: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32980 1727096598.60532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096598.60535: stdout chunk (state=3): >>><<< 32980 1727096598.60544: stderr chunk (state=3): >>><<< 32980 1727096598.60807: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": 
"systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096598.64829: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096598.64959: _low_level_execute_command(): starting 32980 1727096598.64962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096596.632493-33434-123300487904116/ > /dev/null 2>&1 && sleep 0' 32980 1727096598.66231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096598.66322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096598.66329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096598.66344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096598.66351: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096598.66358: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096598.66575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096598.66579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096598.66604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096598.66674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096598.68577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096598.68581: stdout chunk (state=3): >>><<< 32980 1727096598.68584: stderr chunk (state=3): >>><<< 32980 1727096598.68628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096598.68634: handler run complete 32980 1727096598.69079: variable 'ansible_facts' from source: unknown 32980 1727096598.69419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096598.70945: variable 'ansible_facts' from source: unknown 32980 1727096598.71232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096598.71751: attempt loop complete, returning result 32980 1727096598.71755: _execute() done 32980 1727096598.71758: dumping result to json 32980 1727096598.71822: done dumping result, returning 32980 1727096598.71892: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-457d-ef33-0000000004c4] 32980 1727096598.71896: sending task result for task 0afff68d-5257-457d-ef33-0000000004c4 32980 1727096598.73911: done sending task result for task 0afff68d-5257-457d-ef33-0000000004c4 32980 1727096598.73914: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096598.73980: no more pending results, returning what we have 32980 1727096598.73983: results queue empty 32980 1727096598.73984: checking for any_errors_fatal 32980 1727096598.73987: done checking for any_errors_fatal 32980 1727096598.73987: checking for max_fail_percentage 32980 1727096598.73989: done checking for max_fail_percentage 32980 1727096598.73990: checking to see if all hosts have failed and the running result is not ok 32980 1727096598.73990: done checking to see if all hosts have failed 32980 1727096598.73991: getting the remaining hosts for this loop 32980 1727096598.73992: done getting the remaining hosts for this loop 32980 1727096598.73996: getting the next task for host managed_node2 32980 1727096598.74001: done getting next task for host managed_node2 32980 1727096598.74008: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32980 1727096598.74012: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 32980 1727096598.74022: getting variables 32980 1727096598.74023: in VariableManager get_vars() 32980 1727096598.74052: Calling all_inventory to load vars for managed_node2 32980 1727096598.74055: Calling groups_inventory to load vars for managed_node2 32980 1727096598.74058: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096598.74091: Calling all_plugins_play to load vars for managed_node2 32980 1727096598.74095: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096598.74098: Calling groups_plugins_play to load vars for managed_node2 32980 1727096598.74440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096598.75007: done with get_vars() 32980 1727096598.75020: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:03:18 -0400 (0:00:02.173) 0:00:10.677 ****** 32980 1727096598.75132: entering _queue_task() for managed_node2/package_facts 32980 1727096598.75134: Creating lock for package_facts 32980 1727096598.75465: worker is 1 (out of 1 available) 32980 1727096598.75578: exiting _queue_task() for managed_node2/package_facts 32980 1727096598.75711: done queuing things up, now waiting for results queue to drain 32980 1727096598.75713: waiting for pending results... 32980 1727096598.76188: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 32980 1727096598.76585: in run() - task 0afff68d-5257-457d-ef33-0000000004c5 32980 1727096598.76589: variable 'ansible_search_path' from source: unknown 32980 1727096598.76593: variable 'ansible_search_path' from source: unknown 32980 1727096598.76595: calling self._execute() 32980 1727096598.76798: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096598.76813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096598.76829: variable 'omit' from source: magic vars 32980 1727096598.77630: variable 'ansible_distribution_major_version' from source: facts 32980 1727096598.77648: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096598.77660: variable 'omit' from source: magic vars 32980 1727096598.77751: variable 'omit' from source: magic vars 32980 1727096598.77801: variable 'omit' from source: magic vars 32980 1727096598.77845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096598.77897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096598.77972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096598.77976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096598.77979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096598.78001: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096598.78013: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096598.78020: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 32980 1727096598.78133: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096598.78142: Set connection var ansible_timeout to 10 32980 1727096598.78148: Set connection var ansible_shell_type to sh 32980 1727096598.78153: Set connection var ansible_connection to ssh 32980 1727096598.78163: Set connection var ansible_shell_executable to /bin/sh 32980 1727096598.78174: Set connection var ansible_pipelining to False 32980 1727096598.78221: variable 'ansible_shell_executable' from source: unknown 32980 1727096598.78228: variable 'ansible_connection' from source: unknown 32980 1727096598.78231: variable 'ansible_module_compression' from source: unknown 32980 1727096598.78233: variable 'ansible_shell_type' from source: unknown 32980 1727096598.78236: variable 'ansible_shell_executable' from source: unknown 32980 1727096598.78238: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096598.78329: variable 'ansible_pipelining' from source: unknown 32980 1727096598.78335: variable 'ansible_timeout' from source: unknown 32980 1727096598.78337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096598.78548: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096598.78556: variable 'omit' from source: magic vars 32980 1727096598.78561: starting attempt loop 32980 1727096598.78564: running the handler 32980 1727096598.78569: _low_level_execute_command(): starting 32980 1727096598.78571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096598.79327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096598.79385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096598.79409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096598.79444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096598.79549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096598.81197: stdout chunk (state=3): >>>/root <<< 32980 1727096598.81341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096598.81345: stdout chunk (state=3): >>><<< 32980 1727096598.81349: stderr chunk (state=3): >>><<< 32980 1727096598.81465: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096598.81471: _low_level_execute_command(): starting 32980 1727096598.81475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725 `" && echo ansible-tmp-1727096598.8137844-33520-170001850179725="` echo /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725 `" ) && sleep 0' 32980 1727096598.82014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096598.82028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096598.82052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096598.82084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096598.82161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096598.82205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096598.82220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096598.82242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096598.82318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096598.84295: stdout chunk (state=3): >>>ansible-tmp-1727096598.8137844-33520-170001850179725=/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725 <<< 32980 1727096598.84673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096598.84677: stdout 
chunk (state=3): >>><<< 32980 1727096598.84680: stderr chunk (state=3): >>><<< 32980 1727096598.84683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096598.8137844-33520-170001850179725=/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096598.84685: variable 'ansible_module_compression' from source: unknown 32980 1727096598.84688: ANSIBALLZ: Using lock for package_facts 32980 1727096598.84690: ANSIBALLZ: Acquiring lock 32980 1727096598.84692: ANSIBALLZ: Lock acquired: 140258563816432 32980 1727096598.84693: ANSIBALLZ: Creating module 32980 1727096599.09818: ANSIBALLZ: Writing module into payload 32980 1727096599.09912: ANSIBALLZ: Writing module 32980 1727096599.09933: ANSIBALLZ: Renaming module 32980 1727096599.09939: ANSIBALLZ: Done creating module 32980 1727096599.09970: variable 'ansible_facts' from source: unknown 32980 1727096599.10094: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py 32980 1727096599.10201: Sending initial data 32980 1727096599.10205: Sent initial data (162 bytes) 32980 1727096599.10853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096599.10913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096599.10951: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32980 1727096599.12627: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096599.12662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096599.12698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpo5t2i3v5 /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py <<< 32980 1727096599.12701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py" <<< 32980 1727096599.12727: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpo5t2i3v5" to remote "/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py" <<< 32980 1727096599.13719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096599.13758: stderr chunk (state=3): >>><<< 32980 1727096599.13761: stdout chunk (state=3): >>><<< 32980 1727096599.13779: done transferring module to remote 32980 1727096599.13788: _low_level_execute_command(): starting 32980 1727096599.13793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/ /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py && sleep 0' 32980 1727096599.14228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096599.14232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096599.14257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096599.14261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.14264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096599.14278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.14334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096599.14345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096599.14349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096599.14373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096599.16143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096599.16206: stderr chunk (state=3): >>><<< 32980 1727096599.16210: stdout chunk (state=3): >>><<< 32980 1727096599.16213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096599.16215: _low_level_execute_command(): starting 32980 1727096599.16218: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/AnsiballZ_package_facts.py && sleep 0' 32980 1727096599.16692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096599.16696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.16698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096599.16700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096599.16702: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.16752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' <<< 32980 1727096599.16759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096599.16772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096599.16803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096599.61216: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 32980 1727096599.61235: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 32980 1727096599.61269: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 32980 1727096599.61309: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 32980 1727096599.61347: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 32980 1727096599.61355: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 32980 1727096599.61364: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": 
[{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 32980 1727096599.61372: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": 
"rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 32980 1727096599.61379: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": 
"510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 32980 1727096599.61399: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 32980 1727096599.61414: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32980 1727096599.63178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 
closed. <<< 32980 1727096599.63207: stderr chunk (state=3): >>><<< 32980 1727096599.63212: stdout chunk (state=3): >>><<< 32980 1727096599.63249: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", 
"version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096599.64729: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096599.64749: _low_level_execute_command(): starting 32980 1727096599.64753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096598.8137844-33520-170001850179725/ > /dev/null 2>&1 && sleep 0' 32980 1727096599.65277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096599.65281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.65283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096599.65286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096599.65288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096599.65318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096599.65342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096599.65413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096599.67253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096599.67286: stderr chunk (state=3): >>><<< 32980 1727096599.67289: stdout chunk (state=3): >>><<< 32980 1727096599.67303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096599.67308: handler run complete 32980 1727096599.67813: variable 'ansible_facts' from source: unknown 32980 1727096599.68176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.70174: variable 'ansible_facts' from source: unknown 32980 1727096599.70619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.71310: attempt loop complete, returning result 32980 1727096599.71327: _execute() done 32980 1727096599.71334: dumping result to json 32980 1727096599.71537: done dumping result, returning 32980 1727096599.71554: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-457d-ef33-0000000004c5] 32980 1727096599.71772: sending task result for task 0afff68d-5257-457d-ef33-0000000004c5 32980 1727096599.78375: done sending task result for task 0afff68d-5257-457d-ef33-0000000004c5 32980 1727096599.78379: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096599.78477: no more pending results, returning what we have 32980 1727096599.78480: results queue empty 32980 1727096599.78481: checking for any_errors_fatal 32980 1727096599.78486: done checking for any_errors_fatal 32980 1727096599.78487: checking for max_fail_percentage 32980 1727096599.78489: done checking for max_fail_percentage 32980 1727096599.78490: checking to see if all hosts have failed and the running result is not ok 32980 1727096599.78491: done checking to see if all hosts have failed 32980 1727096599.78491: getting the remaining hosts for this loop 32980 1727096599.78493: done getting the remaining hosts for this loop 32980 1727096599.78496: getting the next task for host managed_node2 32980 1727096599.78503: done getting next task for host managed_node2 32980 1727096599.78506: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32980 1727096599.78509: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 32980 1727096599.78518: getting variables 32980 1727096599.78519: in VariableManager get_vars() 32980 1727096599.78550: Calling all_inventory to load vars for managed_node2 32980 1727096599.78552: Calling groups_inventory to load vars for managed_node2 32980 1727096599.78555: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096599.78563: Calling all_plugins_play to load vars for managed_node2 32980 1727096599.78565: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096599.78570: Calling groups_plugins_play to load vars for managed_node2 32980 1727096599.79737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.81514: done with get_vars() 32980 1727096599.81665: done getting variables 32980 1727096599.81732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:03:19 -0400 (0:00:01.066) 0:00:11.744 ****** 32980 1727096599.81777: entering _queue_task() for managed_node2/debug 32980 1727096599.82178: worker is 1 (out of 1 available) 32980 1727096599.82378: exiting _queue_task() for managed_node2/debug 32980 1727096599.82389: done queuing things up, now waiting for results queue to drain 32980 1727096599.82390: waiting for pending results... 32980 1727096599.82582: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 32980 1727096599.82587: in run() - task 0afff68d-5257-457d-ef33-000000000017 32980 1727096599.82598: variable 'ansible_search_path' from source: unknown 32980 1727096599.82604: variable 'ansible_search_path' from source: unknown 32980 1727096599.82640: calling self._execute() 32980 1727096599.82723: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.82736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.82750: variable 'omit' from source: magic vars 32980 1727096599.83092: variable 'ansible_distribution_major_version' from source: facts 32980 1727096599.83109: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096599.83119: variable 'omit' from source: magic vars 32980 1727096599.83173: variable 'omit' from source: magic vars 32980 1727096599.83270: variable 'network_provider' from source: set_fact 32980 1727096599.83293: variable 'omit' from source: magic vars 32980 1727096599.83336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096599.83376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096599.83402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096599.83424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096599.83439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 32980 1727096599.83473: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096599.83483: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.83490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.83592: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096599.83772: Set connection var ansible_timeout to 10 32980 1727096599.83776: Set connection var ansible_shell_type to sh 32980 1727096599.83778: Set connection var ansible_connection to ssh 32980 1727096599.83780: Set connection var ansible_shell_executable to /bin/sh 32980 1727096599.83782: Set connection var ansible_pipelining to False 32980 1727096599.83783: variable 'ansible_shell_executable' from source: unknown 32980 1727096599.83786: variable 'ansible_connection' from source: unknown 32980 1727096599.83788: variable 'ansible_module_compression' from source: unknown 32980 1727096599.83789: variable 'ansible_shell_type' from source: unknown 32980 1727096599.83792: variable 'ansible_shell_executable' from source: unknown 32980 1727096599.83793: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.83795: variable 'ansible_pipelining' from source: unknown 32980 1727096599.83797: variable 'ansible_timeout' from source: unknown 32980 1727096599.83799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.83817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096599.83833: variable 'omit' from source: magic vars 32980 1727096599.83842: starting attempt loop 32980 1727096599.83850: running the handler 32980 1727096599.83896: handler run complete 32980 1727096599.83914: attempt loop complete, returning result 32980 1727096599.83920: _execute() done 32980 1727096599.83926: dumping result to json 32980 1727096599.83932: done dumping result, returning 32980 1727096599.83944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-457d-ef33-000000000017] 32980 1727096599.83951: sending task result for task 0afff68d-5257-457d-ef33-000000000017 ok: [managed_node2] => {} MSG: Using network provider: nm 32980 1727096599.84103: no more pending results, returning what we have 32980 1727096599.84106: results queue empty 32980 1727096599.84107: checking for any_errors_fatal 32980 1727096599.84117: done checking for any_errors_fatal 32980 1727096599.84117: checking for max_fail_percentage 32980 1727096599.84119: done checking for max_fail_percentage 32980 1727096599.84120: checking to see if all hosts have failed and the running result is not ok 32980 1727096599.84120: done checking to see if all hosts have failed 32980 1727096599.84121: getting the remaining hosts for this loop 32980 1727096599.84122: done getting the remaining hosts for this loop 32980 1727096599.84126: getting the next task for host managed_node2 32980 1727096599.84132: done getting next task for host managed_node2 32980 1727096599.84137: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32980 1727096599.84140: ^ state is: 
HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096599.84156: getting variables 32980 1727096599.84157: in VariableManager get_vars() 32980 1727096599.84198: Calling all_inventory to load vars for managed_node2 32980 1727096599.84201: Calling groups_inventory to load vars for managed_node2 32980 1727096599.84203: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096599.84215: Calling all_plugins_play to load vars for managed_node2 32980 1727096599.84217: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096599.84220: Calling groups_plugins_play to load vars for managed_node2 32980 1727096599.84741: done sending task result for task 0afff68d-5257-457d-ef33-000000000017 32980 1727096599.84744: WORKER PROCESS EXITING 32980 1727096599.85558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.87058: done with get_vars() 32980 1727096599.87080: done getting variables 32980 1727096599.87135: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:03:19 -0400 (0:00:00.053) 0:00:11.798 ****** 32980 1727096599.87166: entering _queue_task() for managed_node2/fail 32980 1727096599.87429: worker is 1 (out of 1 available) 32980 1727096599.87441: exiting _queue_task() for managed_node2/fail 32980 1727096599.87453: done queuing things up, now waiting for results queue to drain 32980 1727096599.87455: waiting for pending results... 
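The trace up to this point covers two role steps on the managed node: the package inventory gathered by package_facts (whose result is printed as "censored" because no_log: true is set for it) and a debug task that reports the chosen provider, "Using network provider: nm", with network_provider coming from an earlier set_fact. A minimal sketch of what these two tasks could look like, reconstructed only from the module arguments and messages visible in this log and not copied from roles/network/tasks/main.yml:

    # Hedged reconstruction from the log output above, not the role's actual source.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto          # the log shows module_args manager=["auto"], strategy="first"
        strategy: first
      no_log: true             # why the task result above is shown as "censored"

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # set by an earlier set_fact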
32980 1727096599.87883: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32980 1727096599.87888: in run() - task 0afff68d-5257-457d-ef33-000000000018 32980 1727096599.87892: variable 'ansible_search_path' from source: unknown 32980 1727096599.87894: variable 'ansible_search_path' from source: unknown 32980 1727096599.87897: calling self._execute() 32980 1727096599.87975: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.87987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.88000: variable 'omit' from source: magic vars 32980 1727096599.88363: variable 'ansible_distribution_major_version' from source: facts 32980 1727096599.88384: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096599.88512: variable 'network_state' from source: role '' defaults 32980 1727096599.88527: Evaluated conditional (network_state != {}): False 32980 1727096599.88535: when evaluation is False, skipping this task 32980 1727096599.88541: _execute() done 32980 1727096599.88548: dumping result to json 32980 1727096599.88560: done dumping result, returning 32980 1727096599.88572: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-457d-ef33-000000000018] 32980 1727096599.88583: sending task result for task 0afff68d-5257-457d-ef33-000000000018 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096599.88836: no more pending results, returning what we have 32980 1727096599.88839: results queue empty 32980 1727096599.88841: checking for any_errors_fatal 32980 1727096599.88848: done checking for any_errors_fatal 32980 1727096599.88849: checking for max_fail_percentage 32980 1727096599.88851: done checking for max_fail_percentage 32980 1727096599.88852: checking to see if all hosts have failed and the running result is not ok 32980 1727096599.88853: done checking to see if all hosts have failed 32980 1727096599.88853: getting the remaining hosts for this loop 32980 1727096599.88855: done getting the remaining hosts for this loop 32980 1727096599.88858: getting the next task for host managed_node2 32980 1727096599.88866: done getting next task for host managed_node2 32980 1727096599.88873: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32980 1727096599.88876: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096599.88891: getting variables 32980 1727096599.88893: in VariableManager get_vars() 32980 1727096599.88934: Calling all_inventory to load vars for managed_node2 32980 1727096599.88937: Calling groups_inventory to load vars for managed_node2 32980 1727096599.88939: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096599.88951: Calling all_plugins_play to load vars for managed_node2 32980 1727096599.88954: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096599.88957: Calling groups_plugins_play to load vars for managed_node2 32980 1727096599.89481: done sending task result for task 0afff68d-5257-457d-ef33-000000000018 32980 1727096599.89484: WORKER PROCESS EXITING 32980 1727096599.90313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.92406: done with get_vars() 32980 1727096599.92427: done getting variables 32980 1727096599.92488: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:03:19 -0400 (0:00:00.053) 0:00:11.851 ****** 32980 1727096599.92520: entering _queue_task() for managed_node2/fail 32980 1727096599.92796: worker is 1 (out of 1 available) 32980 1727096599.92808: exiting _queue_task() for managed_node2/fail 32980 1727096599.92818: done queuing things up, now waiting for results queue to drain 32980 1727096599.92820: waiting for pending results... 
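The initscripts abort guard is skipped because its condition, reported back as false_condition "network_state != {}", evaluates to False: no network_state was supplied for this run. A hedged sketch of the guard pattern this trace implies; the real task at roles/network/tasks/main.yml:11 may carry additional conditions and a different message that the skip result does not reveal:

    # Illustrative guard task; only the failing condition is known from the log.
    - name: >-
        Abort applying the network state configuration if using the
        network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider.  # placeholder wording
      when: network_state != {}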
32980 1727096599.93071: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32980 1727096599.93194: in run() - task 0afff68d-5257-457d-ef33-000000000019 32980 1727096599.93211: variable 'ansible_search_path' from source: unknown 32980 1727096599.93218: variable 'ansible_search_path' from source: unknown 32980 1727096599.93253: calling self._execute() 32980 1727096599.93338: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.93349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.93362: variable 'omit' from source: magic vars 32980 1727096599.93713: variable 'ansible_distribution_major_version' from source: facts 32980 1727096599.93734: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096599.93857: variable 'network_state' from source: role '' defaults 32980 1727096599.93874: Evaluated conditional (network_state != {}): False 32980 1727096599.93883: when evaluation is False, skipping this task 32980 1727096599.93890: _execute() done 32980 1727096599.93896: dumping result to json 32980 1727096599.93902: done dumping result, returning 32980 1727096599.93913: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-457d-ef33-000000000019] 32980 1727096599.93921: sending task result for task 0afff68d-5257-457d-ef33-000000000019 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096599.94212: no more pending results, returning what we have 32980 1727096599.94215: results queue empty 32980 1727096599.94217: checking for any_errors_fatal 32980 1727096599.94221: done checking for any_errors_fatal 32980 1727096599.94222: checking for max_fail_percentage 32980 1727096599.94223: done checking for max_fail_percentage 32980 1727096599.94224: checking to see if all hosts have failed and the running result is not ok 32980 1727096599.94225: done checking to see if all hosts have failed 32980 1727096599.94226: getting the remaining hosts for this loop 32980 1727096599.94227: done getting the remaining hosts for this loop 32980 1727096599.94230: getting the next task for host managed_node2 32980 1727096599.94238: done getting next task for host managed_node2 32980 1727096599.94241: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32980 1727096599.94244: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096599.94258: getting variables 32980 1727096599.94260: in VariableManager get_vars() 32980 1727096599.94300: Calling all_inventory to load vars for managed_node2 32980 1727096599.94303: Calling groups_inventory to load vars for managed_node2 32980 1727096599.94312: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096599.94334: Calling all_plugins_play to load vars for managed_node2 32980 1727096599.94337: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096599.94341: Calling groups_plugins_play to load vars for managed_node2 32980 1727096599.94901: done sending task result for task 0afff68d-5257-457d-ef33-000000000019 32980 1727096599.94905: WORKER PROCESS EXITING 32980 1727096599.95681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096599.97882: done with get_vars() 32980 1727096599.97904: done getting variables 32980 1727096599.97960: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:03:19 -0400 (0:00:00.054) 0:00:11.906 ****** 32980 1727096599.97992: entering _queue_task() for managed_node2/fail 32980 1727096599.98334: worker is 1 (out of 1 available) 32980 1727096599.98347: exiting _queue_task() for managed_node2/fail 32980 1727096599.98358: done queuing things up, now waiting for results queue to drain 32980 1727096599.98359: waiting for pending results... 
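The version-below-8 abort uses the same false_condition, and the log attributes network_state to "role '' defaults". The implication is that the role defaults network_state to an empty mapping, roughly as below; this is an inference from the trace, not a quotation of the role's defaults file:

    # Implied entry in the role's defaults (inferred, not copied).
    network_state: {}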
32980 1727096599.98627: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32980 1727096599.98758: in run() - task 0afff68d-5257-457d-ef33-00000000001a 32980 1727096599.98781: variable 'ansible_search_path' from source: unknown 32980 1727096599.98789: variable 'ansible_search_path' from source: unknown 32980 1727096599.98831: calling self._execute() 32980 1727096599.98919: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096599.98930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096599.98942: variable 'omit' from source: magic vars 32980 1727096599.99332: variable 'ansible_distribution_major_version' from source: facts 32980 1727096599.99375: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096599.99611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.01686: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.01775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.01820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.01870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.01905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.02073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.02077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.02079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.02096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.02116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.02582: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.02585: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32980 1727096600.02631: variable 'ansible_distribution' from source: facts 32980 1727096600.02681: variable '__network_rh_distros' from source: role '' defaults 32980 1727096600.02702: Evaluated conditional (ansible_distribution in __network_rh_distros): True 32980 1727096600.03172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.03313: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.03341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.03387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.03487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.03541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.03638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.03665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.03936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.03940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.03942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.03944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.03995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.04100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.04134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.04923: variable 'network_connections' from source: task vars 32980 1727096600.05130: variable 'interface' from source: play vars 32980 1727096600.05133: variable 'interface' from source: play vars 32980 1727096600.05135: variable 'vlan_interface' from source: play vars 32980 1727096600.05278: variable 'vlan_interface' from source: play vars 32980 1727096600.05292: variable 'interface' from source: play vars 32980 
1727096600.05358: variable 'interface' from source: play vars 32980 1727096600.05377: variable 'network_state' from source: role '' defaults 32980 1727096600.05441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096600.05913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096600.05953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096600.05988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096600.06020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096600.06070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096600.06097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096600.06132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.06161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096600.06204: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 32980 1727096600.06215: when evaluation is False, skipping this task 32980 1727096600.06222: _execute() done 32980 1727096600.06227: dumping result to json 32980 1727096600.06232: done dumping result, returning 32980 1727096600.06241: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-457d-ef33-00000000001a] 32980 1727096600.06249: sending task result for task 0afff68d-5257-457d-ef33-00000000001a skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 32980 1727096600.06645: no more pending results, returning what we have 32980 1727096600.06648: results queue empty 32980 1727096600.06650: checking for any_errors_fatal 32980 1727096600.06655: done checking for any_errors_fatal 32980 1727096600.06656: checking for max_fail_percentage 32980 1727096600.06657: done checking for max_fail_percentage 32980 1727096600.06658: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.06659: done checking to see if all hosts have failed 32980 1727096600.06660: getting the remaining hosts for this loop 32980 1727096600.06662: done getting the remaining hosts for this loop 32980 1727096600.06666: getting the 
next task for host managed_node2 32980 1727096600.06675: done getting next task for host managed_node2 32980 1727096600.06679: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32980 1727096600.06682: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096600.06695: getting variables 32980 1727096600.06697: in VariableManager get_vars() 32980 1727096600.06739: Calling all_inventory to load vars for managed_node2 32980 1727096600.06742: Calling groups_inventory to load vars for managed_node2 32980 1727096600.06744: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.06755: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.06758: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.06761: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.07680: done sending task result for task 0afff68d-5257-457d-ef33-00000000001a 32980 1727096600.07684: WORKER PROCESS EXITING 32980 1727096600.08737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.11693: done with get_vars() 32980 1727096600.11716: done getting variables 32980 1727096600.11826: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:03:20 -0400 (0:00:00.138) 0:00:12.045 ****** 32980 1727096600.11854: entering _queue_task() for managed_node2/dnf 32980 1727096600.12183: worker is 1 (out of 1 available) 32980 1727096600.12203: exiting _queue_task() for managed_node2/dnf 32980 1727096600.12216: done queuing things up, now waiting for results queue to drain 32980 1727096600.12217: waiting for pending results... 
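The teaming abort is skipped because neither network_connections nor network_state defines any connection of type "team". The conditional appears above as one long string; the same expression, laid out over several lines purely for readability (the line breaks are editorial, the filter chain is verbatim from the skip result):

    when: >-
      network_connections
        | selectattr("type", "defined")
        | selectattr("type", "match", "^team$")
        | list | length > 0
      or network_state.get("interfaces", [])
        | selectattr("type", "defined")
        | selectattr("type", "match", "^team$")
        | list | length > 0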
32980 1727096600.12468: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32980 1727096600.12596: in run() - task 0afff68d-5257-457d-ef33-00000000001b 32980 1727096600.12616: variable 'ansible_search_path' from source: unknown 32980 1727096600.12624: variable 'ansible_search_path' from source: unknown 32980 1727096600.12665: calling self._execute() 32980 1727096600.12758: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.13174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.13178: variable 'omit' from source: magic vars 32980 1727096600.13558: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.13713: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.13974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.16183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.16274: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.16319: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.16596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.16628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.16849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.16886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.16914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.16979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.17022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.17132: variable 'ansible_distribution' from source: facts 32980 1727096600.17142: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.17162: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32980 1727096600.17293: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096600.17414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.17597: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.17623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.17876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.17880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.17882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.17903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.17931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.18012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.18373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.18378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.18381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.18384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.18387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.18390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.18607: variable 'network_connections' from source: task vars 32980 1727096600.18638: variable 'interface' from source: play vars 32980 1727096600.18714: variable 'interface' from source: play vars 32980 1727096600.18732: variable 'vlan_interface' from source: play vars 32980 1727096600.18794: variable 'vlan_interface' from source: play vars 32980 1727096600.18807: variable 'interface' from source: play vars 32980 
1727096600.18866: variable 'interface' from source: play vars 32980 1727096600.18943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096600.19108: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096600.19147: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096600.19199: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096600.19232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096600.19278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096600.19314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096600.19344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.19382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096600.19444: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096600.19649: variable 'network_connections' from source: task vars 32980 1727096600.19660: variable 'interface' from source: play vars 32980 1727096600.19725: variable 'interface' from source: play vars 32980 1727096600.19741: variable 'vlan_interface' from source: play vars 32980 1727096600.19804: variable 'vlan_interface' from source: play vars 32980 1727096600.19817: variable 'interface' from source: play vars 32980 1727096600.19882: variable 'interface' from source: play vars 32980 1727096600.19922: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096600.19930: when evaluation is False, skipping this task 32980 1727096600.19938: _execute() done 32980 1727096600.19944: dumping result to json 32980 1727096600.19951: done dumping result, returning 32980 1727096600.19961: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000001b] 32980 1727096600.19972: sending task result for task 0afff68d-5257-457d-ef33-00000000001b skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096600.20133: no more pending results, returning what we have 32980 1727096600.20136: results queue empty 32980 1727096600.20137: checking for any_errors_fatal 32980 1727096600.20143: done checking for any_errors_fatal 32980 1727096600.20143: checking for max_fail_percentage 32980 1727096600.20145: done checking for max_fail_percentage 32980 1727096600.20146: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.20147: done checking to see if all hosts 
have failed 32980 1727096600.20147: getting the remaining hosts for this loop 32980 1727096600.20149: done getting the remaining hosts for this loop 32980 1727096600.20152: getting the next task for host managed_node2 32980 1727096600.20159: done getting next task for host managed_node2 32980 1727096600.20162: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32980 1727096600.20165: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096600.20286: done sending task result for task 0afff68d-5257-457d-ef33-00000000001b 32980 1727096600.20289: WORKER PROCESS EXITING 32980 1727096600.20325: getting variables 32980 1727096600.20329: in VariableManager get_vars() 32980 1727096600.20366: Calling all_inventory to load vars for managed_node2 32980 1727096600.20370: Calling groups_inventory to load vars for managed_node2 32980 1727096600.20375: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.20383: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.20385: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.20387: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.23566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.25806: done with get_vars() 32980 1727096600.25837: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32980 1727096600.25945: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:03:20 -0400 (0:00:00.141) 0:00:12.186 ****** 32980 1727096600.25991: entering _queue_task() for managed_node2/yum 32980 1727096600.25993: Creating lock for yum 32980 1727096600.26497: worker is 1 (out of 1 available) 32980 1727096600.26508: exiting _queue_task() for managed_node2/yum 32980 1727096600.26517: done queuing things up, now waiting for results queue to drain 32980 1727096600.26518: waiting for pending results... 
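The DNF update check at main.yml:36 is skipped for the same underlying reason as the teaming guard: no wireless or team connections are defined, so "__network_wireless_connections_defined or __network_team_connections_defined" is False. The log only shows the dnf action plugin being loaded and the skip; as a rough, hedged sketch of what such a check could look like (the package names and the use of check_mode are assumptions, not taken from the role):

    # Assumption-heavy sketch: reports whether newer network packages are available
    # without installing them. Package names below are placeholders.
    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name:
          - NetworkManager-wifi
          - NetworkManager-team
        state: latest
      check_mode: true
      when: __network_wireless_connections_defined or __network_team_connections_defined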
32980 1727096600.26684: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32980 1727096600.27086: in run() - task 0afff68d-5257-457d-ef33-00000000001c 32980 1727096600.27090: variable 'ansible_search_path' from source: unknown 32980 1727096600.27094: variable 'ansible_search_path' from source: unknown 32980 1727096600.27096: calling self._execute() 32980 1727096600.27099: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.27101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.27104: variable 'omit' from source: magic vars 32980 1727096600.27361: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.27377: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.27609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.31269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.31361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.31400: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.31480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.31523: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.31620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.31682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.31710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.31804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.31818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.31942: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.31968: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32980 1727096600.31972: when evaluation is False, skipping this task 32980 1727096600.31996: _execute() done 32980 1727096600.31999: dumping result to json 32980 1727096600.32002: done dumping result, returning 32980 1727096600.32011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000001c] 32980 
1727096600.32015: sending task result for task 0afff68d-5257-457d-ef33-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32980 1727096600.32170: no more pending results, returning what we have 32980 1727096600.32176: results queue empty 32980 1727096600.32177: checking for any_errors_fatal 32980 1727096600.32183: done checking for any_errors_fatal 32980 1727096600.32184: checking for max_fail_percentage 32980 1727096600.32185: done checking for max_fail_percentage 32980 1727096600.32186: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.32187: done checking to see if all hosts have failed 32980 1727096600.32188: getting the remaining hosts for this loop 32980 1727096600.32190: done getting the remaining hosts for this loop 32980 1727096600.32308: getting the next task for host managed_node2 32980 1727096600.32317: done getting next task for host managed_node2 32980 1727096600.32321: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32980 1727096600.32324: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.32340: done sending task result for task 0afff68d-5257-457d-ef33-00000000001c 32980 1727096600.32344: WORKER PROCESS EXITING 32980 1727096600.32401: getting variables 32980 1727096600.32403: in VariableManager get_vars() 32980 1727096600.32450: Calling all_inventory to load vars for managed_node2 32980 1727096600.32454: Calling groups_inventory to load vars for managed_node2 32980 1727096600.32456: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.32470: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.32476: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.32480: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.35037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.36872: done with get_vars() 32980 1727096600.36904: done getting variables 32980 1727096600.36991: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:03:20 -0400 (0:00:00.110) 0:00:12.296 ****** 32980 1727096600.37025: entering _queue_task() for managed_node2/fail 32980 1727096600.37338: worker is 1 (out of 1 available) 32980 1727096600.37349: exiting _queue_task() for managed_node2/fail 32980 1727096600.37360: done queuing things up, now waiting for results queue to drain 32980 1727096600.37361: waiting for pending results... 
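Annotation: the task queued above (tasks/main.yml:60) uses the fail action and, per the trace that follows, is gated on __network_wireless_connections_defined or __network_team_connections_defined, which evaluates to False. A minimal sketch under those assumptions; the message wording and any additional guards in the real role task are unknown here.

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-                  # wording is an assumption for illustration
          Wireless or team connections are defined, which requires restarting
          NetworkManager; set the role variable that permits the restart.
      when: __network_wireless_connections_defined or __network_team_connections_defined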
32980 1727096600.37621: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32980 1727096600.37793: in run() - task 0afff68d-5257-457d-ef33-00000000001d 32980 1727096600.37796: variable 'ansible_search_path' from source: unknown 32980 1727096600.37799: variable 'ansible_search_path' from source: unknown 32980 1727096600.37972: calling self._execute() 32980 1727096600.37976: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.37980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.37982: variable 'omit' from source: magic vars 32980 1727096600.38285: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.38303: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.38426: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096600.38616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.40743: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.40825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.40866: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.40909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.40945: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.41030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.41147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.41151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.41154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.41161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.41211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.41241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.41277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.41320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.41340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.41390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.41417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.41446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.41495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.41516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.41698: variable 'network_connections' from source: task vars 32980 1727096600.41773: variable 'interface' from source: play vars 32980 1727096600.41788: variable 'interface' from source: play vars 32980 1727096600.41808: variable 'vlan_interface' from source: play vars 32980 1727096600.41871: variable 'vlan_interface' from source: play vars 32980 1727096600.41887: variable 'interface' from source: play vars 32980 1727096600.41953: variable 'interface' from source: play vars 32980 1727096600.42032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096600.42191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096600.42236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096600.42288: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096600.42322: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096600.42573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096600.42576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096600.42578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.42580: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096600.42582: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096600.42802: variable 'network_connections' from source: task vars 32980 1727096600.42816: variable 'interface' from source: play vars 32980 1727096600.42884: variable 'interface' from source: play vars 32980 1727096600.42899: variable 'vlan_interface' from source: play vars 32980 1727096600.42969: variable 'vlan_interface' from source: play vars 32980 1727096600.42983: variable 'interface' from source: play vars 32980 1727096600.43048: variable 'interface' from source: play vars 32980 1727096600.43091: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096600.43110: when evaluation is False, skipping this task 32980 1727096600.43118: _execute() done 32980 1727096600.43125: dumping result to json 32980 1727096600.43133: done dumping result, returning 32980 1727096600.43147: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000001d] 32980 1727096600.43157: sending task result for task 0afff68d-5257-457d-ef33-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096600.43431: no more pending results, returning what we have 32980 1727096600.43435: results queue empty 32980 1727096600.43436: checking for any_errors_fatal 32980 1727096600.43441: done checking for any_errors_fatal 32980 1727096600.43442: checking for max_fail_percentage 32980 1727096600.43444: done checking for max_fail_percentage 32980 1727096600.43445: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.43446: done checking to see if all hosts have failed 32980 1727096600.43447: getting the remaining hosts for this loop 32980 1727096600.43449: done getting the remaining hosts for this loop 32980 1727096600.43454: getting the next task for host managed_node2 32980 1727096600.43462: done getting next task for host managed_node2 32980 1727096600.43466: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32980 1727096600.43470: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.43484: getting variables 32980 1727096600.43486: in VariableManager get_vars() 32980 1727096600.43529: Calling all_inventory to load vars for managed_node2 32980 1727096600.43533: Calling groups_inventory to load vars for managed_node2 32980 1727096600.43535: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.43547: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.43550: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.43553: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.44255: done sending task result for task 0afff68d-5257-457d-ef33-00000000001d 32980 1727096600.44259: WORKER PROCESS EXITING 32980 1727096600.45387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.46591: done with get_vars() 32980 1727096600.46607: done getting variables 32980 1727096600.46653: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:03:20 -0400 (0:00:00.096) 0:00:12.393 ****** 32980 1727096600.46680: entering _queue_task() for managed_node2/package 32980 1727096600.46908: worker is 1 (out of 1 available) 32980 1727096600.46921: exiting _queue_task() for managed_node2/package 32980 1727096600.46934: done queuing things up, now waiting for results queue to drain 32980 1727096600.46935: waiting for pending results... 
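Annotation: the Install packages task (tasks/main.yml:73) runs through the generic package action and, per the trace below, is skipped because every entry in network_packages is already present in ansible_facts.packages. A sketch consistent with the logged condition, not the role's literal source:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())   # condition copied from the skip result below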
32980 1727096600.47106: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 32980 1727096600.47196: in run() - task 0afff68d-5257-457d-ef33-00000000001e 32980 1727096600.47206: variable 'ansible_search_path' from source: unknown 32980 1727096600.47210: variable 'ansible_search_path' from source: unknown 32980 1727096600.47238: calling self._execute() 32980 1727096600.47306: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.47309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.47318: variable 'omit' from source: magic vars 32980 1727096600.47639: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.47642: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.48080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096600.48198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096600.48244: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096600.48287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096600.48330: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096600.48833: variable 'network_packages' from source: role '' defaults 32980 1727096600.48953: variable '__network_provider_setup' from source: role '' defaults 32980 1727096600.48959: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096600.49228: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096600.49236: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096600.49374: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096600.49627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.59097: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.59154: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.59392: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.59424: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.59449: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.59516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.59827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.59845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.59892: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.59906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.59948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.59972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.60168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.60210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.60263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.60732: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32980 1727096600.60840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.60863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.61095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.61131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.61144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.61231: variable 'ansible_python' from source: facts 32980 1727096600.61254: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32980 1727096600.61551: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096600.61623: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096600.61974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.61986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32980 1727096600.61997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.62036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.62049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.62314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.62324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.62372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.62383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.62397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.62735: variable 'network_connections' from source: task vars 32980 1727096600.62770: variable 'interface' from source: play vars 32980 1727096600.62839: variable 'interface' from source: play vars 32980 1727096600.62857: variable 'vlan_interface' from source: play vars 32980 1727096600.63151: variable 'vlan_interface' from source: play vars 32980 1727096600.63173: variable 'interface' from source: play vars 32980 1727096600.63257: variable 'interface' from source: play vars 32980 1727096600.63537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096600.63573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096600.63584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.63620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096600.63650: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096600.64331: variable 'network_connections' from source: task vars 32980 1727096600.64334: variable 'interface' from source: play vars 32980 1727096600.64434: variable 'interface' from source: play vars 32980 1727096600.64445: variable 
'vlan_interface' from source: play vars 32980 1727096600.64744: variable 'vlan_interface' from source: play vars 32980 1727096600.64752: variable 'interface' from source: play vars 32980 1727096600.64850: variable 'interface' from source: play vars 32980 1727096600.65113: variable '__network_packages_default_wireless' from source: role '' defaults 32980 1727096600.65194: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096600.65912: variable 'network_connections' from source: task vars 32980 1727096600.65915: variable 'interface' from source: play vars 32980 1727096600.65973: variable 'interface' from source: play vars 32980 1727096600.65977: variable 'vlan_interface' from source: play vars 32980 1727096600.66033: variable 'vlan_interface' from source: play vars 32980 1727096600.66074: variable 'interface' from source: play vars 32980 1727096600.66303: variable 'interface' from source: play vars 32980 1727096600.66328: variable '__network_packages_default_team' from source: role '' defaults 32980 1727096600.66407: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096600.67107: variable 'network_connections' from source: task vars 32980 1727096600.67110: variable 'interface' from source: play vars 32980 1727096600.67374: variable 'interface' from source: play vars 32980 1727096600.67377: variable 'vlan_interface' from source: play vars 32980 1727096600.67380: variable 'vlan_interface' from source: play vars 32980 1727096600.67382: variable 'interface' from source: play vars 32980 1727096600.67514: variable 'interface' from source: play vars 32980 1727096600.67572: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096600.67631: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096600.67637: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096600.67902: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096600.68313: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32980 1727096600.69189: variable 'network_connections' from source: task vars 32980 1727096600.69193: variable 'interface' from source: play vars 32980 1727096600.69245: variable 'interface' from source: play vars 32980 1727096600.69253: variable 'vlan_interface' from source: play vars 32980 1727096600.69572: variable 'vlan_interface' from source: play vars 32980 1727096600.69575: variable 'interface' from source: play vars 32980 1727096600.69631: variable 'interface' from source: play vars 32980 1727096600.69648: variable 'ansible_distribution' from source: facts 32980 1727096600.69656: variable '__network_rh_distros' from source: role '' defaults 32980 1727096600.69666: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.69703: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32980 1727096600.69912: variable 'ansible_distribution' from source: facts 32980 1727096600.69923: variable '__network_rh_distros' from source: role '' defaults 32980 1727096600.69934: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.69951: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32980 1727096600.70137: variable 'ansible_distribution' from source: facts 32980 1727096600.70146: variable '__network_rh_distros' from source: 
role '' defaults 32980 1727096600.70155: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.70195: variable 'network_provider' from source: set_fact 32980 1727096600.70329: variable 'ansible_facts' from source: unknown 32980 1727096600.70947: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32980 1727096600.70955: when evaluation is False, skipping this task 32980 1727096600.70961: _execute() done 32980 1727096600.70970: dumping result to json 32980 1727096600.70990: done dumping result, returning 32980 1727096600.71002: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-457d-ef33-00000000001e] 32980 1727096600.71010: sending task result for task 0afff68d-5257-457d-ef33-00000000001e 32980 1727096600.71230: done sending task result for task 0afff68d-5257-457d-ef33-00000000001e 32980 1727096600.71234: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32980 1727096600.71285: no more pending results, returning what we have 32980 1727096600.71289: results queue empty 32980 1727096600.71289: checking for any_errors_fatal 32980 1727096600.71295: done checking for any_errors_fatal 32980 1727096600.71295: checking for max_fail_percentage 32980 1727096600.71297: done checking for max_fail_percentage 32980 1727096600.71302: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.71303: done checking to see if all hosts have failed 32980 1727096600.71304: getting the remaining hosts for this loop 32980 1727096600.71305: done getting the remaining hosts for this loop 32980 1727096600.71309: getting the next task for host managed_node2 32980 1727096600.71316: done getting next task for host managed_node2 32980 1727096600.71325: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32980 1727096600.71328: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.71341: getting variables 32980 1727096600.71411: in VariableManager get_vars() 32980 1727096600.71461: Calling all_inventory to load vars for managed_node2 32980 1727096600.71464: Calling groups_inventory to load vars for managed_node2 32980 1727096600.71527: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.71539: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.71542: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.71545: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.77606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.81403: done with get_vars() 32980 1727096600.81433: done getting variables 32980 1727096600.81636: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:03:20 -0400 (0:00:00.349) 0:00:12.743 ****** 32980 1727096600.81671: entering _queue_task() for managed_node2/package 32980 1727096600.82358: worker is 1 (out of 1 available) 32980 1727096600.82576: exiting _queue_task() for managed_node2/package 32980 1727096600.82588: done queuing things up, now waiting for results queue to drain 32980 1727096600.82589: waiting for pending results... 
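Annotation: tasks/main.yml:85 installs the NetworkManager/nmstate stack only when the caller supplies a non-empty network_state; the trace below shows network_state coming from the role defaults (empty in this run), so network_state != {} is False and the task is skipped. A hedged sketch, with the package list inferred from the task title:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:                   # names taken from the task title; exact list is an assumption
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}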
32980 1727096600.82816: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32980 1727096600.82946: in run() - task 0afff68d-5257-457d-ef33-00000000001f 32980 1727096600.82950: variable 'ansible_search_path' from source: unknown 32980 1727096600.82954: variable 'ansible_search_path' from source: unknown 32980 1727096600.82962: calling self._execute() 32980 1727096600.83095: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.83127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.83143: variable 'omit' from source: magic vars 32980 1727096600.84131: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.84139: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.84271: variable 'network_state' from source: role '' defaults 32980 1727096600.84310: Evaluated conditional (network_state != {}): False 32980 1727096600.84319: when evaluation is False, skipping this task 32980 1727096600.84328: _execute() done 32980 1727096600.84336: dumping result to json 32980 1727096600.84378: done dumping result, returning 32980 1727096600.84393: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-457d-ef33-00000000001f] 32980 1727096600.84406: sending task result for task 0afff68d-5257-457d-ef33-00000000001f 32980 1727096600.84582: done sending task result for task 0afff68d-5257-457d-ef33-00000000001f 32980 1727096600.84586: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096600.84631: no more pending results, returning what we have 32980 1727096600.84634: results queue empty 32980 1727096600.84635: checking for any_errors_fatal 32980 1727096600.84642: done checking for any_errors_fatal 32980 1727096600.84643: checking for max_fail_percentage 32980 1727096600.84645: done checking for max_fail_percentage 32980 1727096600.84645: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.84646: done checking to see if all hosts have failed 32980 1727096600.84647: getting the remaining hosts for this loop 32980 1727096600.84648: done getting the remaining hosts for this loop 32980 1727096600.84652: getting the next task for host managed_node2 32980 1727096600.84661: done getting next task for host managed_node2 32980 1727096600.84664: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32980 1727096600.84879: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.84895: getting variables 32980 1727096600.84897: in VariableManager get_vars() 32980 1727096600.84935: Calling all_inventory to load vars for managed_node2 32980 1727096600.84938: Calling groups_inventory to load vars for managed_node2 32980 1727096600.84941: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.84950: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.84952: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.84955: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.87305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.88283: done with get_vars() 32980 1727096600.88299: done getting variables 32980 1727096600.88342: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:03:20 -0400 (0:00:00.066) 0:00:12.810 ****** 32980 1727096600.88369: entering _queue_task() for managed_node2/package 32980 1727096600.88629: worker is 1 (out of 1 available) 32980 1727096600.88642: exiting _queue_task() for managed_node2/package 32980 1727096600.88747: done queuing things up, now waiting for results queue to drain 32980 1727096600.88749: waiting for pending results... 
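Annotation: the python3-libnmstate task at tasks/main.yml:96 is gated on the same network_state != {} test and is skipped below for the same reason. For context, a play that defined a non-empty network_state (nmstate-style schema; the interface shown is purely hypothetical) would make both nmstate-related install tasks run instead of skip:

    - hosts: managed_node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1        # hypothetical device name
              type: ethernet
              state: up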
32980 1727096600.88992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32980 1727096600.89097: in run() - task 0afff68d-5257-457d-ef33-000000000020 32980 1727096600.89101: variable 'ansible_search_path' from source: unknown 32980 1727096600.89104: variable 'ansible_search_path' from source: unknown 32980 1727096600.89137: calling self._execute() 32980 1727096600.89248: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.89260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.89281: variable 'omit' from source: magic vars 32980 1727096600.89859: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.89862: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.89990: variable 'network_state' from source: role '' defaults 32980 1727096600.90007: Evaluated conditional (network_state != {}): False 32980 1727096600.90015: when evaluation is False, skipping this task 32980 1727096600.90023: _execute() done 32980 1727096600.90036: dumping result to json 32980 1727096600.90079: done dumping result, returning 32980 1727096600.90084: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-457d-ef33-000000000020] 32980 1727096600.90086: sending task result for task 0afff68d-5257-457d-ef33-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096600.90265: no more pending results, returning what we have 32980 1727096600.90270: results queue empty 32980 1727096600.90271: checking for any_errors_fatal 32980 1727096600.90278: done checking for any_errors_fatal 32980 1727096600.90278: checking for max_fail_percentage 32980 1727096600.90280: done checking for max_fail_percentage 32980 1727096600.90281: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.90282: done checking to see if all hosts have failed 32980 1727096600.90282: getting the remaining hosts for this loop 32980 1727096600.90284: done getting the remaining hosts for this loop 32980 1727096600.90287: getting the next task for host managed_node2 32980 1727096600.90296: done getting next task for host managed_node2 32980 1727096600.90300: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32980 1727096600.90303: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.90321: getting variables 32980 1727096600.90323: in VariableManager get_vars() 32980 1727096600.90361: Calling all_inventory to load vars for managed_node2 32980 1727096600.90364: Calling groups_inventory to load vars for managed_node2 32980 1727096600.90366: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.90380: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.90382: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.90386: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.90901: done sending task result for task 0afff68d-5257-457d-ef33-000000000020 32980 1727096600.90904: WORKER PROCESS EXITING 32980 1727096600.91548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.92435: done with get_vars() 32980 1727096600.92449: done getting variables 32980 1727096600.92526: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:03:20 -0400 (0:00:00.041) 0:00:12.852 ****** 32980 1727096600.92547: entering _queue_task() for managed_node2/service 32980 1727096600.92548: Creating lock for service 32980 1727096600.92842: worker is 1 (out of 1 available) 32980 1727096600.92853: exiting _queue_task() for managed_node2/service 32980 1727096600.92864: done queuing things up, now waiting for results queue to drain 32980 1727096600.92866: waiting for pending results... 
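Annotation: tasks/main.yml:109 is the first task in this run to use the service action (note the "Creating lock for service" entry above). Per the trace below it is gated on the same wireless/team condition and skipped. A minimal sketch under that assumption, with the service name inferred from the task title:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager    # inferred from the task title
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined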
32980 1727096600.93139: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32980 1727096600.93282: in run() - task 0afff68d-5257-457d-ef33-000000000021 32980 1727096600.93313: variable 'ansible_search_path' from source: unknown 32980 1727096600.93318: variable 'ansible_search_path' from source: unknown 32980 1727096600.93352: calling self._execute() 32980 1727096600.93446: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096600.93451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096600.93521: variable 'omit' from source: magic vars 32980 1727096600.93846: variable 'ansible_distribution_major_version' from source: facts 32980 1727096600.93851: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096600.93961: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096600.94178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096600.95749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096600.96044: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096600.96073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096600.96100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096600.96120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096600.96184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.96207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.96294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.96297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.96300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.96437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.96440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.96443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32980 1727096600.96445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.96448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.96673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096600.96676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096600.96679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.96682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096600.96684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096600.96739: variable 'network_connections' from source: task vars 32980 1727096600.96750: variable 'interface' from source: play vars 32980 1727096600.96819: variable 'interface' from source: play vars 32980 1727096600.96831: variable 'vlan_interface' from source: play vars 32980 1727096600.96894: variable 'vlan_interface' from source: play vars 32980 1727096600.96900: variable 'interface' from source: play vars 32980 1727096600.96958: variable 'interface' from source: play vars 32980 1727096600.97027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096600.97225: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096600.97239: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096600.97251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096600.97298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096600.97351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096600.97363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096600.97396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096600.97415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096600.97461: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096600.97613: variable 'network_connections' from source: task vars 32980 1727096600.97616: variable 'interface' from source: play vars 32980 1727096600.97657: variable 'interface' from source: play vars 32980 1727096600.97674: variable 'vlan_interface' from source: play vars 32980 1727096600.97713: variable 'vlan_interface' from source: play vars 32980 1727096600.97719: variable 'interface' from source: play vars 32980 1727096600.97759: variable 'interface' from source: play vars 32980 1727096600.97790: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096600.97800: when evaluation is False, skipping this task 32980 1727096600.97803: _execute() done 32980 1727096600.97806: dumping result to json 32980 1727096600.97808: done dumping result, returning 32980 1727096600.97810: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-457d-ef33-000000000021] 32980 1727096600.97812: sending task result for task 0afff68d-5257-457d-ef33-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096600.97937: no more pending results, returning what we have 32980 1727096600.97940: results queue empty 32980 1727096600.97941: checking for any_errors_fatal 32980 1727096600.97947: done checking for any_errors_fatal 32980 1727096600.97947: checking for max_fail_percentage 32980 1727096600.97949: done checking for max_fail_percentage 32980 1727096600.97950: checking to see if all hosts have failed and the running result is not ok 32980 1727096600.97950: done checking to see if all hosts have failed 32980 1727096600.97951: getting the remaining hosts for this loop 32980 1727096600.97953: done getting the remaining hosts for this loop 32980 1727096600.97956: getting the next task for host managed_node2 32980 1727096600.97964: done getting next task for host managed_node2 32980 1727096600.97969: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32980 1727096600.97972: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096600.97985: getting variables 32980 1727096600.97986: in VariableManager get_vars() 32980 1727096600.98025: Calling all_inventory to load vars for managed_node2 32980 1727096600.98028: Calling groups_inventory to load vars for managed_node2 32980 1727096600.98030: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096600.98039: Calling all_plugins_play to load vars for managed_node2 32980 1727096600.98041: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096600.98044: Calling groups_plugins_play to load vars for managed_node2 32980 1727096600.98584: done sending task result for task 0afff68d-5257-457d-ef33-000000000021 32980 1727096600.98588: WORKER PROCESS EXITING 32980 1727096600.98989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096600.99844: done with get_vars() 32980 1727096600.99860: done getting variables 32980 1727096600.99904: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:03:20 -0400 (0:00:00.073) 0:00:12.925 ****** 32980 1727096600.99929: entering _queue_task() for managed_node2/service 32980 1727096601.00127: worker is 1 (out of 1 available) 32980 1727096601.00140: exiting _queue_task() for managed_node2/service 32980 1727096601.00152: done queuing things up, now waiting for results queue to drain 32980 1727096601.00153: waiting for pending results... 
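Annotation: unlike the preceding tasks, the trace below shows this task's conditional (network_provider == "nm" or network_state != {}) evaluating to True, and it resolves network_service_name from the role defaults before executing. A sketch consistent with that trace; the exact module parameters are assumptions:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}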
32980 1727096601.00323: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32980 1727096601.00406: in run() - task 0afff68d-5257-457d-ef33-000000000022 32980 1727096601.00416: variable 'ansible_search_path' from source: unknown 32980 1727096601.00420: variable 'ansible_search_path' from source: unknown 32980 1727096601.00446: calling self._execute() 32980 1727096601.00516: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096601.00520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096601.00528: variable 'omit' from source: magic vars 32980 1727096601.00797: variable 'ansible_distribution_major_version' from source: facts 32980 1727096601.00806: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096601.00914: variable 'network_provider' from source: set_fact 32980 1727096601.00917: variable 'network_state' from source: role '' defaults 32980 1727096601.00928: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32980 1727096601.00937: variable 'omit' from source: magic vars 32980 1727096601.00971: variable 'omit' from source: magic vars 32980 1727096601.00991: variable 'network_service_name' from source: role '' defaults 32980 1727096601.01044: variable 'network_service_name' from source: role '' defaults 32980 1727096601.01116: variable '__network_provider_setup' from source: role '' defaults 32980 1727096601.01120: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096601.01170: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096601.01178: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096601.01222: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096601.01366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096601.02775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096601.02828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096601.02855: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096601.02885: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096601.02907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096601.02965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096601.02992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096601.03011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096601.03037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 32980 1727096601.03047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096601.03083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096601.03100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096601.03120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096601.03144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096601.03155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096601.03305: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32980 1727096601.03384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096601.03400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096601.03416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096601.03445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096601.03456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096601.03518: variable 'ansible_python' from source: facts 32980 1727096601.03535: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32980 1727096601.03595: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096601.03645: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096601.03730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096601.03748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096601.03769: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096601.03797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096601.03807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096601.03839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096601.03860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096601.03884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096601.03908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096601.03919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096601.04012: variable 'network_connections' from source: task vars 32980 1727096601.04018: variable 'interface' from source: play vars 32980 1727096601.04070: variable 'interface' from source: play vars 32980 1727096601.04086: variable 'vlan_interface' from source: play vars 32980 1727096601.04137: variable 'vlan_interface' from source: play vars 32980 1727096601.04145: variable 'interface' from source: play vars 32980 1727096601.04204: variable 'interface' from source: play vars 32980 1727096601.04272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096601.04406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096601.04443: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096601.04474: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096601.04506: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096601.04552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096601.04573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096601.04598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 
1727096601.04619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096601.04658: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096601.04836: variable 'network_connections' from source: task vars 32980 1727096601.04842: variable 'interface' from source: play vars 32980 1727096601.04900: variable 'interface' from source: play vars 32980 1727096601.04910: variable 'vlan_interface' from source: play vars 32980 1727096601.04964: variable 'vlan_interface' from source: play vars 32980 1727096601.04970: variable 'interface' from source: play vars 32980 1727096601.05021: variable 'interface' from source: play vars 32980 1727096601.05056: variable '__network_packages_default_wireless' from source: role '' defaults 32980 1727096601.05116: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096601.05312: variable 'network_connections' from source: task vars 32980 1727096601.05315: variable 'interface' from source: play vars 32980 1727096601.05363: variable 'interface' from source: play vars 32980 1727096601.05371: variable 'vlan_interface' from source: play vars 32980 1727096601.05429: variable 'vlan_interface' from source: play vars 32980 1727096601.05434: variable 'interface' from source: play vars 32980 1727096601.05487: variable 'interface' from source: play vars 32980 1727096601.05507: variable '__network_packages_default_team' from source: role '' defaults 32980 1727096601.05562: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096601.05749: variable 'network_connections' from source: task vars 32980 1727096601.05752: variable 'interface' from source: play vars 32980 1727096601.05805: variable 'interface' from source: play vars 32980 1727096601.05812: variable 'vlan_interface' from source: play vars 32980 1727096601.05864: variable 'vlan_interface' from source: play vars 32980 1727096601.05870: variable 'interface' from source: play vars 32980 1727096601.05920: variable 'interface' from source: play vars 32980 1727096601.05969: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096601.06013: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096601.06017: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096601.06063: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096601.06199: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32980 1727096601.06518: variable 'network_connections' from source: task vars 32980 1727096601.06522: variable 'interface' from source: play vars 32980 1727096601.06564: variable 'interface' from source: play vars 32980 1727096601.06573: variable 'vlan_interface' from source: play vars 32980 1727096601.06620: variable 'vlan_interface' from source: play vars 32980 1727096601.06625: variable 'interface' from source: play vars 32980 1727096601.06669: variable 'interface' from source: play vars 32980 1727096601.06679: variable 'ansible_distribution' from source: facts 32980 1727096601.06682: variable '__network_rh_distros' from source: role '' defaults 32980 1727096601.06687: variable 'ansible_distribution_major_version' from source: facts 32980 1727096601.06710: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32980 1727096601.06823: variable 'ansible_distribution' from source: facts 32980 1727096601.06827: variable '__network_rh_distros' from source: role '' defaults 32980 1727096601.06829: variable 'ansible_distribution_major_version' from source: facts 32980 1727096601.06842: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32980 1727096601.06954: variable 'ansible_distribution' from source: facts 32980 1727096601.06958: variable '__network_rh_distros' from source: role '' defaults 32980 1727096601.06960: variable 'ansible_distribution_major_version' from source: facts 32980 1727096601.06990: variable 'network_provider' from source: set_fact 32980 1727096601.07006: variable 'omit' from source: magic vars 32980 1727096601.07029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096601.07049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096601.07064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096601.07081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096601.07089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096601.07110: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096601.07114: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096601.07116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096601.07188: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096601.07192: Set connection var ansible_timeout to 10 32980 1727096601.07199: Set connection var ansible_shell_type to sh 32980 1727096601.07201: Set connection var ansible_connection to ssh 32980 1727096601.07203: Set connection var ansible_shell_executable to /bin/sh 32980 1727096601.07208: Set connection var ansible_pipelining to False 32980 1727096601.07225: variable 'ansible_shell_executable' from source: unknown 32980 1727096601.07227: variable 'ansible_connection' from source: unknown 32980 1727096601.07230: variable 'ansible_module_compression' from source: unknown 32980 1727096601.07232: variable 'ansible_shell_type' from source: unknown 32980 1727096601.07234: variable 'ansible_shell_executable' from source: unknown 32980 1727096601.07236: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096601.07246: variable 'ansible_pipelining' from source: unknown 32980 1727096601.07248: variable 'ansible_timeout' from source: unknown 32980 1727096601.07250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096601.07319: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096601.07327: variable 'omit' from source: magic vars 32980 1727096601.07333: starting attempt loop 32980 1727096601.07335: running the handler 32980 1727096601.07395: variable 'ansible_facts' from source: unknown 32980 
1727096601.07862: _low_level_execute_command(): starting 32980 1727096601.07869: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096601.08360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096601.08393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096601.08396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.08399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.08451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096601.08454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096601.08461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.08503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.10155: stdout chunk (state=3): >>>/root <<< 32980 1727096601.10283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096601.10286: stdout chunk (state=3): >>><<< 32980 1727096601.10294: stderr chunk (state=3): >>><<< 32980 1727096601.10313: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096601.10322: _low_level_execute_command(): starting 32980 1727096601.10327: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952 `" && echo 
ansible-tmp-1727096601.1031199-33619-97089496100952="` echo /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952 `" ) && sleep 0' 32980 1727096601.10743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096601.10786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096601.10789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096601.10791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.10793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096601.10795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096601.10798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.10839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096601.10846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.10882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.12792: stdout chunk (state=3): >>>ansible-tmp-1727096601.1031199-33619-97089496100952=/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952 <<< 32980 1727096601.12900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096601.12931: stderr chunk (state=3): >>><<< 32980 1727096601.12934: stdout chunk (state=3): >>><<< 32980 1727096601.12949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096601.1031199-33619-97089496100952=/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 32980 1727096601.12981: variable 'ansible_module_compression' from source: unknown 32980 1727096601.13028: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 32980 1727096601.13033: ANSIBALLZ: Acquiring lock 32980 1727096601.13036: ANSIBALLZ: Lock acquired: 140258569802416 32980 1727096601.13041: ANSIBALLZ: Creating module 32980 1727096601.47414: ANSIBALLZ: Writing module into payload 32980 1727096601.47524: ANSIBALLZ: Writing module 32980 1727096601.47546: ANSIBALLZ: Renaming module 32980 1727096601.47552: ANSIBALLZ: Done creating module 32980 1727096601.47572: variable 'ansible_facts' from source: unknown 32980 1727096601.47697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py 32980 1727096601.47803: Sending initial data 32980 1727096601.47806: Sent initial data (155 bytes) 32980 1727096601.48241: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096601.48244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.48246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096601.48249: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096601.48252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.48306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096601.48313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.48349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.50009: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32980 1727096601.50013: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096601.50030: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096601.50065: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpoo3lcby9 /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py <<< 32980 1727096601.50072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py" <<< 32980 1727096601.50098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpoo3lcby9" to remote "/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py" <<< 32980 1727096601.50105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py" <<< 32980 1727096601.51080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096601.51119: stderr chunk (state=3): >>><<< 32980 1727096601.51122: stdout chunk (state=3): >>><<< 32980 1727096601.51148: done transferring module to remote 32980 1727096601.51156: _low_level_execute_command(): starting 32980 1727096601.51161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/ /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py && sleep 0' 32980 1727096601.51564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096601.51598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096601.51601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096601.51603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096601.51605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096601.51607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.51655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096601.51658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.51694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.53514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096601.53517: stdout chunk (state=3): >>><<< 32980 1727096601.53520: stderr chunk (state=3): >>><<< 32980 1727096601.53531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096601.53535: _low_level_execute_command(): starting 32980 1727096601.53537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/AnsiballZ_systemd.py && sleep 0' 32980 1727096601.53971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096601.53987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.53994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096601.54046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096601.54059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.54113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.83521: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4685824", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302739968", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1939017000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target 
dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32980 1727096601.85414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096601.85450: stderr chunk (state=3): >>><<< 32980 1727096601.85454: stdout chunk (state=3): >>><<< 32980 1727096601.85504: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4685824", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302739968", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1939017000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096601.85895: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096601.85916: _low_level_execute_command(): starting 32980 1727096601.85925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096601.1031199-33619-97089496100952/ > /dev/null 2>&1 && sleep 0' 32980 1727096601.86632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096601.86676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096601.86781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096601.86809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096601.86831: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096601.86849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096601.86918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096601.88897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096601.88901: stdout chunk (state=3): >>><<< 32980 1727096601.88904: stderr chunk (state=3): >>><<< 32980 1727096601.89113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096601.89116: handler run complete 32980 1727096601.89118: attempt loop complete, returning result 32980 1727096601.89120: _execute() done 32980 1727096601.89121: dumping result to json 32980 1727096601.89123: done dumping result, returning 32980 1727096601.89124: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-457d-ef33-000000000022] 32980 1727096601.89126: sending task result for task 0afff68d-5257-457d-ef33-000000000022 32980 1727096601.89717: done sending task result for task 0afff68d-5257-457d-ef33-000000000022 32980 1727096601.89720: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096601.89776: no more pending results, returning what we have 32980 1727096601.89780: results queue empty 32980 1727096601.89781: checking for any_errors_fatal 32980 1727096601.89787: done checking for any_errors_fatal 32980 1727096601.89788: checking for max_fail_percentage 32980 1727096601.89790: done checking for max_fail_percentage 32980 1727096601.89791: checking to see if all hosts have failed and the running result is not ok 32980 1727096601.89792: done checking to see if all hosts have failed 32980 1727096601.89792: getting the remaining hosts for this loop 32980 1727096601.89794: done getting the remaining hosts for this loop 32980 1727096601.89797: getting the next task for host managed_node2 32980 1727096601.89805: done getting next task for host managed_node2 32980 1727096601.89809: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32980 1727096601.89811: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096601.89822: getting variables 32980 1727096601.89823: in VariableManager get_vars() 32980 1727096601.89863: Calling all_inventory to load vars for managed_node2 32980 1727096601.89866: Calling groups_inventory to load vars for managed_node2 32980 1727096601.89870: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096601.89883: Calling all_plugins_play to load vars for managed_node2 32980 1727096601.89885: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096601.89888: Calling groups_plugins_play to load vars for managed_node2 32980 1727096601.93437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096601.96590: done with get_vars() 32980 1727096601.96613: done getting variables 32980 1727096601.96813: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:03:21 -0400 (0:00:00.969) 0:00:13.895 ****** 32980 1727096601.96847: entering _queue_task() for managed_node2/service 32980 1727096601.97602: worker is 1 (out of 1 available) 32980 1727096601.97613: exiting _queue_task() for managed_node2/service 32980 1727096601.97824: done queuing things up, now waiting for results queue to drain 32980 1727096601.97825: waiting for pending results... 
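For reference, the ansible.legacy.systemd call logged above for "Enable and start NetworkManager" amounts to the following task. This is a minimal sketch reconstructed from the logged module arguments (name=NetworkManager, state=started, enabled=true), not the role's actual task file; no_log is shown because the result above is censored for exactly that reason.

    - name: Enable and start NetworkManager   # illustrative reconstruction, not the role's task file
      ansible.builtin.systemd:                # the log shows ansible.legacy.systemd, which resolves to this module
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                            # matches the censored result reported for managed_node2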
32980 1727096601.98487: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32980 1727096601.98593: in run() - task 0afff68d-5257-457d-ef33-000000000023 32980 1727096601.98606: variable 'ansible_search_path' from source: unknown 32980 1727096601.98609: variable 'ansible_search_path' from source: unknown 32980 1727096601.98642: calling self._execute() 32980 1727096601.98866: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096601.98872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096601.98884: variable 'omit' from source: magic vars 32980 1727096601.99699: variable 'ansible_distribution_major_version' from source: facts 32980 1727096601.99775: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096601.99930: variable 'network_provider' from source: set_fact 32980 1727096601.99934: Evaluated conditional (network_provider == "nm"): True 32980 1727096602.00024: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096602.00332: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096602.00893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096602.03369: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096602.03486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096602.03558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096602.03602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096602.03648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096602.03764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096602.03805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096602.03836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096602.03920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096602.03947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096602.04017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096602.04072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32980 1727096602.04118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096602.04212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096602.04226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096602.04396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096602.04401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096602.04404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096602.04461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096602.04487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096602.04688: variable 'network_connections' from source: task vars 32980 1727096602.04725: variable 'interface' from source: play vars 32980 1727096602.04851: variable 'interface' from source: play vars 32980 1727096602.04943: variable 'vlan_interface' from source: play vars 32980 1727096602.05182: variable 'vlan_interface' from source: play vars 32980 1727096602.05185: variable 'interface' from source: play vars 32980 1727096602.05188: variable 'interface' from source: play vars 32980 1727096602.05250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096602.05415: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096602.05450: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096602.05644: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096602.05679: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096602.05721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096602.05736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096602.05761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096602.05788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096602.05835: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096602.06094: variable 'network_connections' from source: task vars 32980 1727096602.06097: variable 'interface' from source: play vars 32980 1727096602.06160: variable 'interface' from source: play vars 32980 1727096602.06163: variable 'vlan_interface' from source: play vars 32980 1727096602.06222: variable 'vlan_interface' from source: play vars 32980 1727096602.06228: variable 'interface' from source: play vars 32980 1727096602.06286: variable 'interface' from source: play vars 32980 1727096602.06478: Evaluated conditional (__network_wpa_supplicant_required): False 32980 1727096602.06482: when evaluation is False, skipping this task 32980 1727096602.06484: _execute() done 32980 1727096602.06486: dumping result to json 32980 1727096602.06488: done dumping result, returning 32980 1727096602.06490: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-457d-ef33-000000000023] 32980 1727096602.06491: sending task result for task 0afff68d-5257-457d-ef33-000000000023 32980 1727096602.06551: done sending task result for task 0afff68d-5257-457d-ef33-000000000023 32980 1727096602.06554: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32980 1727096602.06604: no more pending results, returning what we have 32980 1727096602.06607: results queue empty 32980 1727096602.06608: checking for any_errors_fatal 32980 1727096602.06622: done checking for any_errors_fatal 32980 1727096602.06622: checking for max_fail_percentage 32980 1727096602.06624: done checking for max_fail_percentage 32980 1727096602.06625: checking to see if all hosts have failed and the running result is not ok 32980 1727096602.06625: done checking to see if all hosts have failed 32980 1727096602.06626: getting the remaining hosts for this loop 32980 1727096602.06627: done getting the remaining hosts for this loop 32980 1727096602.06631: getting the next task for host managed_node2 32980 1727096602.06637: done getting next task for host managed_node2 32980 1727096602.06640: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32980 1727096602.06643: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096602.06657: getting variables 32980 1727096602.06658: in VariableManager get_vars() 32980 1727096602.06700: Calling all_inventory to load vars for managed_node2 32980 1727096602.06703: Calling groups_inventory to load vars for managed_node2 32980 1727096602.06705: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096602.06714: Calling all_plugins_play to load vars for managed_node2 32980 1727096602.06717: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096602.06719: Calling groups_plugins_play to load vars for managed_node2 32980 1727096602.08478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096602.10269: done with get_vars() 32980 1727096602.10423: done getting variables 32980 1727096602.10547: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:03:22 -0400 (0:00:00.138) 0:00:14.033 ****** 32980 1727096602.10654: entering _queue_task() for managed_node2/service 32980 1727096602.11396: worker is 1 (out of 1 available) 32980 1727096602.11406: exiting _queue_task() for managed_node2/service 32980 1727096602.11415: done queuing things up, now waiting for results queue to drain 32980 1727096602.11416: waiting for pending results... 32980 1727096602.12089: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 32980 1727096602.12093: in run() - task 0afff68d-5257-457d-ef33-000000000024 32980 1727096602.12097: variable 'ansible_search_path' from source: unknown 32980 1727096602.12099: variable 'ansible_search_path' from source: unknown 32980 1727096602.12137: calling self._execute() 32980 1727096602.12475: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096602.12480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096602.12482: variable 'omit' from source: magic vars 32980 1727096602.13243: variable 'ansible_distribution_major_version' from source: facts 32980 1727096602.13260: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096602.13386: variable 'network_provider' from source: set_fact 32980 1727096602.13398: Evaluated conditional (network_provider == "initscripts"): False 32980 1727096602.13417: when evaluation is False, skipping this task 32980 1727096602.13427: _execute() done 32980 1727096602.13434: dumping result to json 32980 1727096602.13442: done dumping result, returning 32980 1727096602.13454: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-457d-ef33-000000000024] 32980 1727096602.13464: sending task result for task 0afff68d-5257-457d-ef33-000000000024 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096602.13669: no more pending results, returning what we have 32980 1727096602.13673: results queue empty 32980 1727096602.13674: checking for 
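The skip recorded above for "Enable and start wpa_supplicant" follows from the role's gating: wpa_supplicant is only managed when the provider is NetworkManager and a wireless or IEEE 802.1X connection makes it necessary. A minimal sketch of that gating, assuming a plain ansible.builtin.service task (the role's real task and the exact default expressions behind __network_wpa_supplicant_required may differ):

    - name: Enable and start wpa_supplicant   # illustrative approximation of the role's task
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"                    # evaluated True in this run
        - __network_wpa_supplicant_required | bool    # evaluated False, so the task is skipped

With no wireless or 802.1X profiles defined in network_connections, __network_wpa_supplicant_required stays false and the task is skipped on managed_node2, as reported.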
any_errors_fatal 32980 1727096602.13685: done checking for any_errors_fatal 32980 1727096602.13686: checking for max_fail_percentage 32980 1727096602.13688: done checking for max_fail_percentage 32980 1727096602.13734: checking to see if all hosts have failed and the running result is not ok 32980 1727096602.13736: done checking to see if all hosts have failed 32980 1727096602.13736: getting the remaining hosts for this loop 32980 1727096602.13738: done getting the remaining hosts for this loop 32980 1727096602.13743: getting the next task for host managed_node2 32980 1727096602.13752: done getting next task for host managed_node2 32980 1727096602.13756: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32980 1727096602.13760: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096602.13818: getting variables 32980 1727096602.13820: in VariableManager get_vars() 32980 1727096602.13954: Calling all_inventory to load vars for managed_node2 32980 1727096602.13957: Calling groups_inventory to load vars for managed_node2 32980 1727096602.13960: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096602.14034: Calling all_plugins_play to load vars for managed_node2 32980 1727096602.14038: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096602.14041: Calling groups_plugins_play to load vars for managed_node2 32980 1727096602.14573: done sending task result for task 0afff68d-5257-457d-ef33-000000000024 32980 1727096602.14577: WORKER PROCESS EXITING 32980 1727096602.15953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096602.18200: done with get_vars() 32980 1727096602.18223: done getting variables 32980 1727096602.18292: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:03:22 -0400 (0:00:00.076) 0:00:14.109 ****** 32980 1727096602.18326: entering _queue_task() for managed_node2/copy 32980 1727096602.19158: worker is 1 (out of 1 available) 32980 1727096602.19180: exiting _queue_task() for managed_node2/copy 32980 1727096602.19194: done queuing things up, now waiting for results queue to drain 32980 1727096602.19195: waiting for pending results... 
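The "Enable network service" task above, and the "Ensure initscripts network file dependency is present" task that runs next, are both gated on the initscripts provider and are skipped here because network_provider is "nm". A minimal sketch of that pattern, assuming a simple service task and a copy task; the destination path and file content below are assumptions for illustration, not taken from the role:

    - name: Enable network service                                   # illustrative approximation
      ansible.builtin.service:
        name: network
        enabled: true
      when: network_provider == "initscripts"                        # evaluated False in this run

    - name: Ensure initscripts network file dependency is present    # illustrative approximation
      ansible.builtin.copy:
        dest: /etc/sysconfig/network        # assumed path, for illustration only
        content: "# Created by the network system role\n"            # assumed content
        mode: "0644"
      when: network_provider == "initscripts"                        # evaluated False in this run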
32980 1727096602.19438: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32980 1727096602.19550: in run() - task 0afff68d-5257-457d-ef33-000000000025 32980 1727096602.19563: variable 'ansible_search_path' from source: unknown 32980 1727096602.19566: variable 'ansible_search_path' from source: unknown 32980 1727096602.19720: calling self._execute() 32980 1727096602.19723: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096602.19727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096602.19729: variable 'omit' from source: magic vars 32980 1727096602.20061: variable 'ansible_distribution_major_version' from source: facts 32980 1727096602.20077: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096602.20171: variable 'network_provider' from source: set_fact 32980 1727096602.20177: Evaluated conditional (network_provider == "initscripts"): False 32980 1727096602.20180: when evaluation is False, skipping this task 32980 1727096602.20183: _execute() done 32980 1727096602.20185: dumping result to json 32980 1727096602.20188: done dumping result, returning 32980 1727096602.20197: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-457d-ef33-000000000025] 32980 1727096602.20201: sending task result for task 0afff68d-5257-457d-ef33-000000000025 32980 1727096602.20303: done sending task result for task 0afff68d-5257-457d-ef33-000000000025 32980 1727096602.20305: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32980 1727096602.20347: no more pending results, returning what we have 32980 1727096602.20351: results queue empty 32980 1727096602.20351: checking for any_errors_fatal 32980 1727096602.20357: done checking for any_errors_fatal 32980 1727096602.20357: checking for max_fail_percentage 32980 1727096602.20359: done checking for max_fail_percentage 32980 1727096602.20360: checking to see if all hosts have failed and the running result is not ok 32980 1727096602.20361: done checking to see if all hosts have failed 32980 1727096602.20362: getting the remaining hosts for this loop 32980 1727096602.20363: done getting the remaining hosts for this loop 32980 1727096602.20370: getting the next task for host managed_node2 32980 1727096602.20380: done getting next task for host managed_node2 32980 1727096602.20383: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32980 1727096602.20387: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096602.20402: getting variables 32980 1727096602.20403: in VariableManager get_vars() 32980 1727096602.20439: Calling all_inventory to load vars for managed_node2 32980 1727096602.20441: Calling groups_inventory to load vars for managed_node2 32980 1727096602.20443: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096602.20451: Calling all_plugins_play to load vars for managed_node2 32980 1727096602.20453: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096602.20455: Calling groups_plugins_play to load vars for managed_node2 32980 1727096602.22909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096602.26621: done with get_vars() 32980 1727096602.26653: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:03:22 -0400 (0:00:00.085) 0:00:14.195 ****** 32980 1727096602.26864: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 32980 1727096602.26866: Creating lock for fedora.linux_system_roles.network_connections 32980 1727096602.27566: worker is 1 (out of 1 available) 32980 1727096602.27583: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 32980 1727096602.27596: done queuing things up, now waiting for results queue to drain 32980 1727096602.27598: waiting for pending results... 32980 1727096602.28192: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32980 1727096602.28508: in run() - task 0afff68d-5257-457d-ef33-000000000026 32980 1727096602.28512: variable 'ansible_search_path' from source: unknown 32980 1727096602.28515: variable 'ansible_search_path' from source: unknown 32980 1727096602.28571: calling self._execute() 32980 1727096602.28943: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096602.28947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096602.28950: variable 'omit' from source: magic vars 32980 1727096602.29786: variable 'ansible_distribution_major_version' from source: facts 32980 1727096602.29825: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096602.29885: variable 'omit' from source: magic vars 32980 1727096602.30136: variable 'omit' from source: magic vars 32980 1727096602.30330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096602.35209: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096602.35474: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096602.35480: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096602.35706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096602.35709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096602.35812: variable 'network_provider' from source: set_fact 32980 1727096602.36149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096602.36172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096602.36204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096602.36308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096602.36386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096602.36575: variable 'omit' from source: magic vars 32980 1727096602.36809: variable 'omit' from source: magic vars 32980 1727096602.36904: variable 'network_connections' from source: task vars 32980 1727096602.36984: variable 'interface' from source: play vars 32980 1727096602.37236: variable 'interface' from source: play vars 32980 1727096602.37240: variable 'vlan_interface' from source: play vars 32980 1727096602.37242: variable 'vlan_interface' from source: play vars 32980 1727096602.37244: variable 'interface' from source: play vars 32980 1727096602.37398: variable 'interface' from source: play vars 32980 1727096602.37815: variable 'omit' from source: magic vars 32980 1727096602.37823: variable '__lsr_ansible_managed' from source: task vars 32980 1727096602.38008: variable '__lsr_ansible_managed' from source: task vars 32980 1727096602.38465: Loaded config def from plugin (lookup/template) 32980 1727096602.38471: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32980 1727096602.38501: File lookup term: get_ansible_managed.j2 32980 1727096602.38504: variable 'ansible_search_path' from source: unknown 32980 1727096602.38507: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32980 1727096602.38520: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32980 
1727096602.38536: variable 'ansible_search_path' from source: unknown 32980 1727096602.47723: variable 'ansible_managed' from source: unknown 32980 1727096602.47978: variable 'omit' from source: magic vars 32980 1727096602.47982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096602.48003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096602.48021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096602.48084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096602.48088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096602.48282: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096602.48286: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096602.48288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096602.48485: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096602.48491: Set connection var ansible_timeout to 10 32980 1727096602.48497: Set connection var ansible_shell_type to sh 32980 1727096602.48500: Set connection var ansible_connection to ssh 32980 1727096602.48517: Set connection var ansible_shell_executable to /bin/sh 32980 1727096602.48520: Set connection var ansible_pipelining to False 32980 1727096602.48554: variable 'ansible_shell_executable' from source: unknown 32980 1727096602.48557: variable 'ansible_connection' from source: unknown 32980 1727096602.48559: variable 'ansible_module_compression' from source: unknown 32980 1727096602.48561: variable 'ansible_shell_type' from source: unknown 32980 1727096602.48564: variable 'ansible_shell_executable' from source: unknown 32980 1727096602.48565: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096602.48570: variable 'ansible_pipelining' from source: unknown 32980 1727096602.48689: variable 'ansible_timeout' from source: unknown 32980 1727096602.48692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096602.48969: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096602.48975: variable 'omit' from source: magic vars 32980 1727096602.48978: starting attempt loop 32980 1727096602.48981: running the handler 32980 1727096602.49034: _low_level_execute_command(): starting 32980 1727096602.49037: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096602.50158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096602.50162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096602.50202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096602.51899: stdout chunk (state=3): >>>/root <<< 32980 1727096602.52062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096602.52066: stdout chunk (state=3): >>><<< 32980 1727096602.52071: stderr chunk (state=3): >>><<< 32980 1727096602.52116: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096602.52213: _low_level_execute_command(): starting 32980 1727096602.52217: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450 `" && echo ansible-tmp-1727096602.5212295-33686-19463362336450="` echo /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450 `" ) && sleep 0' 32980 1727096602.53041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096602.53100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096602.53338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096602.53364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096602.53448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096602.53513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096602.53765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096602.55564: stdout chunk (state=3): >>>ansible-tmp-1727096602.5212295-33686-19463362336450=/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450 <<< 32980 1727096602.55683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096602.55718: stderr chunk (state=3): >>><<< 32980 1727096602.55722: stdout chunk (state=3): >>><<< 32980 1727096602.55740: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096602.5212295-33686-19463362336450=/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096602.55790: variable 'ansible_module_compression' from source: unknown 32980 1727096602.55835: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 32980 1727096602.55839: ANSIBALLZ: Acquiring lock 32980 1727096602.55841: ANSIBALLZ: Lock acquired: 140258564248672 32980 1727096602.55843: ANSIBALLZ: Creating module 32980 1727096602.78340: ANSIBALLZ: Writing module into payload 32980 1727096602.78613: ANSIBALLZ: Writing module 32980 1727096602.78630: ANSIBALLZ: Renaming module 32980 1727096602.78636: ANSIBALLZ: Done creating module 32980 1727096602.78662: variable 'ansible_facts' from source: unknown 32980 1727096602.78730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py 32980 1727096602.78842: Sending initial data 32980 1727096602.78845: Sent initial data (167 bytes) 32980 1727096602.79262: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096602.79312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096602.79316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096602.79318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096602.79366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096602.79374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096602.79380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096602.79419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096602.81107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096602.81125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096602.81173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmphdkvfh2h /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py <<< 32980 1727096602.81176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py" <<< 32980 1727096602.81201: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmphdkvfh2h" to remote "/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py" <<< 32980 1727096602.81207: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py" <<< 32980 1727096602.82297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096602.82362: stderr chunk (state=3): >>><<< 32980 1727096602.82366: stdout chunk (state=3): >>><<< 32980 1727096602.82391: done transferring module to remote 32980 1727096602.82404: _low_level_execute_command(): starting 32980 1727096602.82407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/ /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py && sleep 0' 32980 1727096602.83114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096602.83262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096602.83266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096602.83270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096602.83275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096602.83310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096602.85240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096602.85244: stdout chunk (state=3): >>><<< 32980 1727096602.85247: stderr chunk (state=3): >>><<< 32980 1727096602.85282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096602.85362: _low_level_execute_command(): starting 32980 1727096602.85366: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/AnsiballZ_network_connections.py && sleep 0' 32980 1727096602.86014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096602.86053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096602.86085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096602.86144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096603.20670: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32980 1727096603.24172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096603.24176: stdout chunk (state=3): >>><<< 32980 1727096603.24179: stderr chunk (state=3): >>><<< 32980 1727096603.24181: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096603.24240: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096603.24282: _low_level_execute_command(): starting 32980 1727096603.24285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096602.5212295-33686-19463362336450/ > /dev/null 2>&1 && sleep 0' 32980 1727096603.24716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096603.24721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.24723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096603.24727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.24776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096603.24783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096603.24788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.24816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096603.26879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096603.26889: stdout chunk (state=3): >>><<< 32980 1727096603.26892: stderr chunk (state=3): >>><<< 32980 1727096603.26896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096603.26899: handler run complete 32980 1727096603.26901: attempt loop complete, returning result 32980 1727096603.26902: _execute() done 32980 1727096603.26904: dumping result to json 32980 1727096603.26906: done dumping result, returning 32980 1727096603.26908: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-457d-ef33-000000000026] 32980 1727096603.26910: sending task result for task 0afff68d-5257-457d-ef33-000000000026 32980 1727096603.27001: done sending task result for task 0afff68d-5257-457d-ef33-000000000026 32980 1727096603.27004: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 [007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active) 32980 1727096603.27139: no more pending results, returning what we have 32980 1727096603.27143: results queue empty 32980 1727096603.27144: checking for any_errors_fatal 32980 1727096603.27152: done checking for any_errors_fatal 32980 1727096603.27153: checking for max_fail_percentage 32980 1727096603.27155: done checking for max_fail_percentage 32980 1727096603.27156: checking to see if all hosts have failed and the running result is not ok 32980 1727096603.27157: done checking to see if all hosts have failed 32980 1727096603.27157: getting the remaining hosts for this loop 32980 1727096603.27159: done getting the remaining hosts for this loop 32980 1727096603.27163: getting the next task for host managed_node2 32980 
1727096603.27279: done getting next task for host managed_node2 32980 1727096603.27284: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32980 1727096603.27287: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096603.27298: getting variables 32980 1727096603.27300: in VariableManager get_vars() 32980 1727096603.27343: Calling all_inventory to load vars for managed_node2 32980 1727096603.27347: Calling groups_inventory to load vars for managed_node2 32980 1727096603.27349: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096603.27360: Calling all_plugins_play to load vars for managed_node2 32980 1727096603.27363: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096603.27365: Calling groups_plugins_play to load vars for managed_node2 32980 1727096603.28479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096603.29514: done with get_vars() 32980 1727096603.29534: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:03:23 -0400 (0:00:01.027) 0:00:15.222 ****** 32980 1727096603.29629: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 32980 1727096603.29631: Creating lock for fedora.linux_system_roles.network_state 32980 1727096603.29985: worker is 1 (out of 1 available) 32980 1727096603.29998: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 32980 1727096603.30014: done queuing things up, now waiting for results queue to drain 32980 1727096603.30015: waiting for pending results... 
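
The "changed" result above shows exactly which profiles the fedora.linux_system_roles.network role pushed through its network_connections module: an ethernet profile lsr101 with MTU 1492 and a VLAN profile lsr101.90 (VLAN ID 90, parent lsr101) with MTU 1280, both with autoconnect, DHCPv4 and IPv6 autoconf disabled, using the nm provider. A minimal sketch of play-level variables that would produce those module arguments is given below; the actual tests_vlan_mtu.yml may structure its play differently, and "managed_node2" is simply the host name taken from this log.

  - hosts: managed_node2
    roles:
      - role: fedora.linux_system_roles.network
        vars:
          # provider 'nm' matches the module args in the result above
          network_provider: nm
          network_connections:
            - name: lsr101
              type: ethernet
              state: up
              autoconnect: false
              mtu: 1492
              ip:
                dhcp4: false
                auto6: false
            - name: lsr101.90
              parent: lsr101
              type: vlan
              vlan_id: 90
              state: up
              autoconnect: false
              mtu: 1280
              ip:
                dhcp4: false
                auto6: false
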
32980 1727096603.30289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 32980 1727096603.30414: in run() - task 0afff68d-5257-457d-ef33-000000000027 32980 1727096603.30435: variable 'ansible_search_path' from source: unknown 32980 1727096603.30439: variable 'ansible_search_path' from source: unknown 32980 1727096603.30480: calling self._execute() 32980 1727096603.30581: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.30585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.30600: variable 'omit' from source: magic vars 32980 1727096603.31255: variable 'ansible_distribution_major_version' from source: facts 32980 1727096603.31258: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096603.31432: variable 'network_state' from source: role '' defaults 32980 1727096603.31444: Evaluated conditional (network_state != {}): False 32980 1727096603.31448: when evaluation is False, skipping this task 32980 1727096603.31451: _execute() done 32980 1727096603.31454: dumping result to json 32980 1727096603.31456: done dumping result, returning 32980 1727096603.31582: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-457d-ef33-000000000027] 32980 1727096603.31585: sending task result for task 0afff68d-5257-457d-ef33-000000000027 32980 1727096603.31647: done sending task result for task 0afff68d-5257-457d-ef33-000000000027 32980 1727096603.31650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096603.31728: no more pending results, returning what we have 32980 1727096603.31730: results queue empty 32980 1727096603.31731: checking for any_errors_fatal 32980 1727096603.31738: done checking for any_errors_fatal 32980 1727096603.31738: checking for max_fail_percentage 32980 1727096603.31740: done checking for max_fail_percentage 32980 1727096603.31741: checking to see if all hosts have failed and the running result is not ok 32980 1727096603.31742: done checking to see if all hosts have failed 32980 1727096603.31742: getting the remaining hosts for this loop 32980 1727096603.31744: done getting the remaining hosts for this loop 32980 1727096603.31746: getting the next task for host managed_node2 32980 1727096603.31752: done getting next task for host managed_node2 32980 1727096603.31756: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32980 1727096603.31758: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096603.31781: getting variables 32980 1727096603.31782: in VariableManager get_vars() 32980 1727096603.31816: Calling all_inventory to load vars for managed_node2 32980 1727096603.31819: Calling groups_inventory to load vars for managed_node2 32980 1727096603.31821: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096603.31829: Calling all_plugins_play to load vars for managed_node2 32980 1727096603.31832: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096603.31835: Calling groups_plugins_play to load vars for managed_node2 32980 1727096603.32735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096603.34038: done with get_vars() 32980 1727096603.34074: done getting variables 32980 1727096603.34134: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:03:23 -0400 (0:00:00.045) 0:00:15.268 ****** 32980 1727096603.34171: entering _queue_task() for managed_node2/debug 32980 1727096603.34466: worker is 1 (out of 1 available) 32980 1727096603.34481: exiting _queue_task() for managed_node2/debug 32980 1727096603.34493: done queuing things up, now waiting for results queue to drain 32980 1727096603.34494: waiting for pending results... 32980 1727096603.34782: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32980 1727096603.34882: in run() - task 0afff68d-5257-457d-ef33-000000000028 32980 1727096603.34894: variable 'ansible_search_path' from source: unknown 32980 1727096603.34898: variable 'ansible_search_path' from source: unknown 32980 1727096603.34932: calling self._execute() 32980 1727096603.35021: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.35024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.35036: variable 'omit' from source: magic vars 32980 1727096603.35397: variable 'ansible_distribution_major_version' from source: facts 32980 1727096603.35408: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096603.35414: variable 'omit' from source: magic vars 32980 1727096603.35532: variable 'omit' from source: magic vars 32980 1727096603.35536: variable 'omit' from source: magic vars 32980 1727096603.35552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096603.35587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096603.35609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096603.35627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.35643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.35677: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096603.35681: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.35683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.35786: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096603.35858: Set connection var ansible_timeout to 10 32980 1727096603.35862: Set connection var ansible_shell_type to sh 32980 1727096603.35864: Set connection var ansible_connection to ssh 32980 1727096603.35866: Set connection var ansible_shell_executable to /bin/sh 32980 1727096603.35870: Set connection var ansible_pipelining to False 32980 1727096603.35875: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.35878: variable 'ansible_connection' from source: unknown 32980 1727096603.35880: variable 'ansible_module_compression' from source: unknown 32980 1727096603.35882: variable 'ansible_shell_type' from source: unknown 32980 1727096603.35884: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.35886: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.35888: variable 'ansible_pipelining' from source: unknown 32980 1727096603.35891: variable 'ansible_timeout' from source: unknown 32980 1727096603.35892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.35976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096603.35984: variable 'omit' from source: magic vars 32980 1727096603.35989: starting attempt loop 32980 1727096603.35992: running the handler 32980 1727096603.36184: variable '__network_connections_result' from source: set_fact 32980 1727096603.36211: handler run complete 32980 1727096603.36230: attempt loop complete, returning result 32980 1727096603.36233: _execute() done 32980 1727096603.36236: dumping result to json 32980 1727096603.36240: done dumping result, returning 32980 1727096603.36247: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-457d-ef33-000000000028] 32980 1727096603.36252: sending task result for task 0afff68d-5257-457d-ef33-000000000028 32980 1727096603.36354: done sending task result for task 0afff68d-5257-457d-ef33-000000000028 32980 1727096603.36357: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active)" ] } 32980 1727096603.36464: no more pending results, returning what we have 32980 1727096603.36470: results queue empty 32980 1727096603.36471: checking for any_errors_fatal 32980 1727096603.36480: done checking for any_errors_fatal 32980 1727096603.36481: checking for max_fail_percentage 32980 
1727096603.36483: done checking for max_fail_percentage 32980 1727096603.36484: checking to see if all hosts have failed and the running result is not ok 32980 1727096603.36484: done checking to see if all hosts have failed 32980 1727096603.36485: getting the remaining hosts for this loop 32980 1727096603.36487: done getting the remaining hosts for this loop 32980 1727096603.36491: getting the next task for host managed_node2 32980 1727096603.36500: done getting next task for host managed_node2 32980 1727096603.36504: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32980 1727096603.36508: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096603.36520: getting variables 32980 1727096603.36522: in VariableManager get_vars() 32980 1727096603.36565: Calling all_inventory to load vars for managed_node2 32980 1727096603.36673: Calling groups_inventory to load vars for managed_node2 32980 1727096603.36677: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096603.36688: Calling all_plugins_play to load vars for managed_node2 32980 1727096603.36691: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096603.36695: Calling groups_plugins_play to load vars for managed_node2 32980 1727096603.38143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096603.39654: done with get_vars() 32980 1727096603.39679: done getting variables 32980 1727096603.39738: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:03:23 -0400 (0:00:00.056) 0:00:15.324 ****** 32980 1727096603.39780: entering _queue_task() for managed_node2/debug 32980 1727096603.40092: worker is 1 (out of 1 available) 32980 1727096603.40105: exiting _queue_task() for managed_node2/debug 32980 1727096603.40116: done queuing things up, now waiting for results queue to drain 32980 1727096603.40117: waiting for pending results... 
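
Two role patterns show up back to back in this stretch of the log: a task skipped because its when: condition is false (Configure networking state is gated on network_state != {}, and network_state keeps its empty role default here), and a plain debug task that prints __network_connections_result.stderr_lines. A rough reconstruction of both tasks, based only on the task names, file/line references and conditionals visible in the log, is sketched below; the arguments of the network_state module are omitted because the skipped task never reveals them.

  # roles/network/tasks/main.yml:171 -- skipped in this run: network_state == {}
  - name: Configure networking state
    fedora.linux_system_roles.network_state:
    when: network_state != {}

  # roles/network/tasks/main.yml:177 -- produced the stderr_lines output above
  - name: Show stderr messages for the network_connections
    debug:
      var: __network_connections_result.stderr_lines
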
32980 1727096603.40489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32980 1727096603.40516: in run() - task 0afff68d-5257-457d-ef33-000000000029 32980 1727096603.40555: variable 'ansible_search_path' from source: unknown 32980 1727096603.40558: variable 'ansible_search_path' from source: unknown 32980 1727096603.40566: calling self._execute() 32980 1727096603.40677: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.40681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.40684: variable 'omit' from source: magic vars 32980 1727096603.41100: variable 'ansible_distribution_major_version' from source: facts 32980 1727096603.41104: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096603.41106: variable 'omit' from source: magic vars 32980 1727096603.41115: variable 'omit' from source: magic vars 32980 1727096603.41155: variable 'omit' from source: magic vars 32980 1727096603.41198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096603.41232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096603.41253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096603.41317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.41320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.41322: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096603.41325: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.41327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.41429: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096603.41434: Set connection var ansible_timeout to 10 32980 1727096603.41438: Set connection var ansible_shell_type to sh 32980 1727096603.41440: Set connection var ansible_connection to ssh 32980 1727096603.41447: Set connection var ansible_shell_executable to /bin/sh 32980 1727096603.41453: Set connection var ansible_pipelining to False 32980 1727096603.41536: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.41539: variable 'ansible_connection' from source: unknown 32980 1727096603.41542: variable 'ansible_module_compression' from source: unknown 32980 1727096603.41544: variable 'ansible_shell_type' from source: unknown 32980 1727096603.41546: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.41548: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.41550: variable 'ansible_pipelining' from source: unknown 32980 1727096603.41552: variable 'ansible_timeout' from source: unknown 32980 1727096603.41554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.41642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 
1727096603.41647: variable 'omit' from source: magic vars 32980 1727096603.41652: starting attempt loop 32980 1727096603.41655: running the handler 32980 1727096603.41711: variable '__network_connections_result' from source: set_fact 32980 1727096603.41786: variable '__network_connections_result' from source: set_fact 32980 1727096603.41975: handler run complete 32980 1727096603.41978: attempt loop complete, returning result 32980 1727096603.41981: _execute() done 32980 1727096603.41983: dumping result to json 32980 1727096603.41985: done dumping result, returning 32980 1727096603.41993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-457d-ef33-000000000029] 32980 1727096603.41996: sending task result for task 0afff68d-5257-457d-ef33-000000000029 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9c06f46-3096-44b6-a493-93c164acfa65 (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, fec20eaf-3c2b-4545-97bb-baae47791113 (not-active)" ] } } 32980 1727096603.42234: done sending task result for task 0afff68d-5257-457d-ef33-000000000029 32980 1727096603.42237: WORKER PROCESS EXITING 32980 1727096603.42415: no more pending results, returning what we have 32980 1727096603.42419: results queue empty 32980 1727096603.42420: checking for any_errors_fatal 32980 1727096603.42424: done checking for any_errors_fatal 32980 1727096603.42425: checking for max_fail_percentage 32980 1727096603.42427: done checking for max_fail_percentage 32980 1727096603.42428: checking to see if all hosts have failed and the running result is not ok 32980 1727096603.42429: done checking to see if all hosts have failed 32980 1727096603.42429: getting the remaining hosts for this loop 32980 1727096603.42431: done getting the remaining hosts for this loop 32980 1727096603.42434: getting the next task for host managed_node2 32980 1727096603.42440: done getting next task for host managed_node2 32980 1727096603.42448: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the 
network_state 32980 1727096603.42451: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096603.42461: getting variables 32980 1727096603.42463: in VariableManager get_vars() 32980 1727096603.42498: Calling all_inventory to load vars for managed_node2 32980 1727096603.42501: Calling groups_inventory to load vars for managed_node2 32980 1727096603.42503: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096603.42511: Calling all_plugins_play to load vars for managed_node2 32980 1727096603.42514: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096603.42517: Calling groups_plugins_play to load vars for managed_node2 32980 1727096603.43743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096603.45354: done with get_vars() 32980 1727096603.45379: done getting variables 32980 1727096603.45437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:03:23 -0400 (0:00:00.056) 0:00:15.381 ****** 32980 1727096603.45472: entering _queue_task() for managed_node2/debug 32980 1727096603.45762: worker is 1 (out of 1 available) 32980 1727096603.45975: exiting _queue_task() for managed_node2/debug 32980 1727096603.45985: done queuing things up, now waiting for results queue to drain 32980 1727096603.45986: waiting for pending results... 
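
The debug output above dumps the full __network_connections_result fact: the module invocation, changed: true, and the per-connection add/up messages carried on stderr. Purely as an illustration (this is not part of the test), a follow-up task could consume that registered result along these lines; the 'error' pattern is an arbitrary example string, not something the role defines.

  - name: Assert that the connection profiles were applied cleanly
    assert:
      that:
        - __network_connections_result is changed
        - __network_connections_result.stderr_lines
          | select('search', 'error') | list | length == 0
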
32980 1727096603.46087: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32980 1727096603.46193: in run() - task 0afff68d-5257-457d-ef33-00000000002a 32980 1727096603.46211: variable 'ansible_search_path' from source: unknown 32980 1727096603.46214: variable 'ansible_search_path' from source: unknown 32980 1727096603.46372: calling self._execute() 32980 1727096603.46378: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.46381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.46383: variable 'omit' from source: magic vars 32980 1727096603.46783: variable 'ansible_distribution_major_version' from source: facts 32980 1727096603.46786: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096603.46826: variable 'network_state' from source: role '' defaults 32980 1727096603.46835: Evaluated conditional (network_state != {}): False 32980 1727096603.46838: when evaluation is False, skipping this task 32980 1727096603.46841: _execute() done 32980 1727096603.46844: dumping result to json 32980 1727096603.46846: done dumping result, returning 32980 1727096603.46854: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-457d-ef33-00000000002a] 32980 1727096603.46863: sending task result for task 0afff68d-5257-457d-ef33-00000000002a 32980 1727096603.47091: done sending task result for task 0afff68d-5257-457d-ef33-00000000002a 32980 1727096603.47094: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 32980 1727096603.47129: no more pending results, returning what we have 32980 1727096603.47132: results queue empty 32980 1727096603.47133: checking for any_errors_fatal 32980 1727096603.47140: done checking for any_errors_fatal 32980 1727096603.47141: checking for max_fail_percentage 32980 1727096603.47143: done checking for max_fail_percentage 32980 1727096603.47144: checking to see if all hosts have failed and the running result is not ok 32980 1727096603.47145: done checking to see if all hosts have failed 32980 1727096603.47146: getting the remaining hosts for this loop 32980 1727096603.47147: done getting the remaining hosts for this loop 32980 1727096603.47150: getting the next task for host managed_node2 32980 1727096603.47156: done getting next task for host managed_node2 32980 1727096603.47160: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32980 1727096603.47163: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096603.47178: getting variables 32980 1727096603.47180: in VariableManager get_vars() 32980 1727096603.47213: Calling all_inventory to load vars for managed_node2 32980 1727096603.47216: Calling groups_inventory to load vars for managed_node2 32980 1727096603.47218: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096603.47226: Calling all_plugins_play to load vars for managed_node2 32980 1727096603.47229: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096603.47231: Calling groups_plugins_play to load vars for managed_node2 32980 1727096603.48505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096603.50063: done with get_vars() 32980 1727096603.50089: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:03:23 -0400 (0:00:00.047) 0:00:15.428 ****** 32980 1727096603.50185: entering _queue_task() for managed_node2/ping 32980 1727096603.50187: Creating lock for ping 32980 1727096603.50515: worker is 1 (out of 1 available) 32980 1727096603.50528: exiting _queue_task() for managed_node2/ping 32980 1727096603.50539: done queuing things up, now waiting for results queue to drain 32980 1727096603.50540: waiting for pending results... 32980 1727096603.50889: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 32980 1727096603.50929: in run() - task 0afff68d-5257-457d-ef33-00000000002b 32980 1727096603.50941: variable 'ansible_search_path' from source: unknown 32980 1727096603.50945: variable 'ansible_search_path' from source: unknown 32980 1727096603.50980: calling self._execute() 32980 1727096603.51105: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.51109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.51252: variable 'omit' from source: magic vars 32980 1727096603.51665: variable 'ansible_distribution_major_version' from source: facts 32980 1727096603.51671: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096603.51675: variable 'omit' from source: magic vars 32980 1727096603.51678: variable 'omit' from source: magic vars 32980 1727096603.51680: variable 'omit' from source: magic vars 32980 1727096603.51879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096603.51913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096603.51935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096603.51951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.51963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096603.51998: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096603.52001: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.52003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.52218: Set connection var ansible_module_compression to 
ZIP_DEFLATED 32980 1727096603.52223: Set connection var ansible_timeout to 10 32980 1727096603.52226: Set connection var ansible_shell_type to sh 32980 1727096603.52228: Set connection var ansible_connection to ssh 32980 1727096603.52236: Set connection var ansible_shell_executable to /bin/sh 32980 1727096603.52241: Set connection var ansible_pipelining to False 32980 1727096603.52266: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.52272: variable 'ansible_connection' from source: unknown 32980 1727096603.52286: variable 'ansible_module_compression' from source: unknown 32980 1727096603.52289: variable 'ansible_shell_type' from source: unknown 32980 1727096603.52291: variable 'ansible_shell_executable' from source: unknown 32980 1727096603.52294: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096603.52298: variable 'ansible_pipelining' from source: unknown 32980 1727096603.52301: variable 'ansible_timeout' from source: unknown 32980 1727096603.52310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096603.52723: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096603.52732: variable 'omit' from source: magic vars 32980 1727096603.52748: starting attempt loop 32980 1727096603.52751: running the handler 32980 1727096603.52754: _low_level_execute_command(): starting 32980 1727096603.52776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096603.53730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096603.53736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.53740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096603.53743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096603.53766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.53836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096603.55751: stdout chunk (state=3): >>>/root <<< 32980 1727096603.55755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096603.55762: stdout chunk (state=3): >>><<< 32980 1727096603.55765: stderr chunk (state=3): >>><<< 32980 1727096603.55905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096603.55909: _low_level_execute_command(): starting 32980 1727096603.55913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390 `" && echo ansible-tmp-1727096603.5578935-33752-143883711516390="` echo /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390 `" ) && sleep 0' 32980 1727096603.57001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096603.57092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096603.57102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096603.57118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096603.57129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096603.57136: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096603.57148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.57159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096603.57171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096603.57178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096603.57187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096603.57197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096603.57321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096603.57370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096603.57483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.57515: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32980 1727096603.59458: stdout chunk (state=3): >>>ansible-tmp-1727096603.5578935-33752-143883711516390=/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390 <<< 32980 1727096603.59608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096603.59612: stdout chunk (state=3): >>><<< 32980 1727096603.59683: stderr chunk (state=3): >>><<< 32980 1727096603.59686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096603.5578935-33752-143883711516390=/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096603.59689: variable 'ansible_module_compression' from source: unknown 32980 1727096603.59733: ANSIBALLZ: Using lock for ping 32980 1727096603.59736: ANSIBALLZ: Acquiring lock 32980 1727096603.59739: ANSIBALLZ: Lock acquired: 140258564370992 32980 1727096603.59744: ANSIBALLZ: Creating module 32980 1727096603.80377: ANSIBALLZ: Writing module into payload 32980 1727096603.80381: ANSIBALLZ: Writing module 32980 1727096603.80383: ANSIBALLZ: Renaming module 32980 1727096603.80386: ANSIBALLZ: Done creating module 32980 1727096603.80388: variable 'ansible_facts' from source: unknown 32980 1727096603.80687: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py 32980 1727096603.80892: Sending initial data 32980 1727096603.80901: Sent initial data (153 bytes) 32980 1727096603.82288: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096603.82304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.82452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096603.84163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096603.84194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096603.84226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp3nfj0vuw /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py <<< 32980 1727096603.84239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py" <<< 32980 1727096603.84394: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp3nfj0vuw" to remote "/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py" <<< 32980 1727096603.84488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py" <<< 32980 1727096603.85563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096603.85641: stderr chunk (state=3): >>><<< 32980 1727096603.85644: stdout chunk (state=3): >>><<< 32980 1727096603.85781: done transferring module to remote 32980 1727096603.85796: _low_level_execute_command(): starting 32980 1727096603.85804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/ /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py && sleep 0' 32980 1727096603.87015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096603.87028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.87044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096603.87213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096603.87225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096603.87396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.87586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096603.89293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096603.89321: stderr chunk (state=3): >>><<< 32980 1727096603.89330: stdout chunk (state=3): >>><<< 32980 1727096603.89354: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096603.89363: _low_level_execute_command(): starting 32980 1727096603.89526: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/AnsiballZ_ping.py && sleep 0' 32980 1727096603.90123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096603.90139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096603.90156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096603.90181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096603.90198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096603.90299: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096603.90312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096603.90638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.05871: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32980 1727096604.07248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096604.07256: stdout chunk (state=3): >>><<< 32980 1727096604.07258: stderr chunk (state=3): >>><<< 32980 1727096604.07377: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
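
The "Re-test connectivity" task above boils down to Ansible's built-in ping module; everything else in that stretch of log is the standard module lifecycle: create a remote temp directory, sftp the generated AnsiballZ_ping.py payload, chmod u+x it, run it with the remote /usr/bin/python3.12 (which returns {"ping": "pong"}), and finally remove the temp directory. Outside the role, the equivalent task (which the role task at roles/network/tasks/main.yml:192 resolves to, per the "entering _queue_task() for managed_node2/ping" line) is simply:

  - name: Re-test connectivity
    ansible.builtin.ping:
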
32980 1727096604.07381: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096604.07383: _low_level_execute_command(): starting 32980 1727096604.07385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096603.5578935-33752-143883711516390/ > /dev/null 2>&1 && sleep 0' 32980 1727096604.08151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.08207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.08230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.08252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.08322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.10223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.10227: stdout chunk (state=3): >>><<< 32980 1727096604.10282: stderr chunk (state=3): >>><<< 32980 1727096604.10286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.10289: handler run complete 32980 1727096604.10292: attempt loop complete, returning result 32980 1727096604.10294: _execute() done 32980 1727096604.10296: dumping result to json 32980 1727096604.10298: done dumping result, returning 32980 1727096604.10304: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-457d-ef33-00000000002b] 32980 1727096604.10306: sending task result for task 0afff68d-5257-457d-ef33-00000000002b 32980 1727096604.10402: done sending task result for task 0afff68d-5257-457d-ef33-00000000002b 32980 1727096604.10404: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 32980 1727096604.10498: no more pending results, returning what we have 32980 1727096604.10502: results queue empty 32980 1727096604.10503: checking for any_errors_fatal 32980 1727096604.10510: done checking for any_errors_fatal 32980 1727096604.10511: checking for max_fail_percentage 32980 1727096604.10513: done checking for max_fail_percentage 32980 1727096604.10514: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.10514: done checking to see if all hosts have failed 32980 1727096604.10515: getting the remaining hosts for this loop 32980 1727096604.10517: done getting the remaining hosts for this loop 32980 1727096604.10521: getting the next task for host managed_node2 32980 1727096604.10532: done getting next task for host managed_node2 32980 1727096604.10535: ^ task is: TASK: meta (role_complete) 32980 1727096604.10538: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096604.10550: getting variables 32980 1727096604.10552: in VariableManager get_vars() 32980 1727096604.10603: Calling all_inventory to load vars for managed_node2 32980 1727096604.10607: Calling groups_inventory to load vars for managed_node2 32980 1727096604.10609: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.10622: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.10625: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.10629: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.12654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.14443: done with get_vars() 32980 1727096604.14465: done getting variables 32980 1727096604.14558: done queuing things up, now waiting for results queue to drain 32980 1727096604.14561: results queue empty 32980 1727096604.14561: checking for any_errors_fatal 32980 1727096604.14564: done checking for any_errors_fatal 32980 1727096604.14565: checking for max_fail_percentage 32980 1727096604.14566: done checking for max_fail_percentage 32980 1727096604.14567: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.14578: done checking to see if all hosts have failed 32980 1727096604.14578: getting the remaining hosts for this loop 32980 1727096604.14579: done getting the remaining hosts for this loop 32980 1727096604.14582: getting the next task for host managed_node2 32980 1727096604.14591: done getting next task for host managed_node2 32980 1727096604.14593: ^ task is: TASK: Include the task 'assert_device_present.yml' 32980 1727096604.14595: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096604.14597: getting variables 32980 1727096604.14598: in VariableManager get_vars() 32980 1727096604.14612: Calling all_inventory to load vars for managed_node2 32980 1727096604.14614: Calling groups_inventory to load vars for managed_node2 32980 1727096604.14616: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.14621: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.14624: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.14626: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.16014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.17567: done with get_vars() 32980 1727096604.17588: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Monday 23 September 2024 09:03:24 -0400 (0:00:00.674) 0:00:16.103 ****** 32980 1727096604.17657: entering _queue_task() for managed_node2/include_tasks 32980 1727096604.17966: worker is 1 (out of 1 available) 32980 1727096604.17980: exiting _queue_task() for managed_node2/include_tasks 32980 1727096604.17991: done queuing things up, now waiting for results queue to drain 32980 1727096604.17993: waiting for pending results... 
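For orientation: the role task "fedora.linux_system_roles.network : Re-test connectivity" that just finished above is a bare ping module call (it returned "pong"), and control then returns to the test playbook, which pulls in assert_device_present.yml at tests_vlan_mtu.yml:46. A minimal reconstruction of those two tasks follows; only the task names, the module name, and the task paths come from the log, and everything else (including the relative include path) is an assumption.

    # fedora.linux_system_roles.network (sketch)
    - name: Re-test connectivity
      ping:

    # tests_vlan_mtu.yml, around line 46 (sketch)
    - name: Include the task 'assert_device_present.yml'
      include_tasks: tasks/assert_device_present.yml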
32980 1727096604.18288: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 32980 1727096604.18386: in run() - task 0afff68d-5257-457d-ef33-00000000005b 32980 1727096604.18391: variable 'ansible_search_path' from source: unknown 32980 1727096604.18424: calling self._execute() 32980 1727096604.18574: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.18577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.18579: variable 'omit' from source: magic vars 32980 1727096604.18910: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.18933: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.18944: _execute() done 32980 1727096604.18953: dumping result to json 32980 1727096604.18960: done dumping result, returning 32980 1727096604.18974: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-457d-ef33-00000000005b] 32980 1727096604.18984: sending task result for task 0afff68d-5257-457d-ef33-00000000005b 32980 1727096604.19230: done sending task result for task 0afff68d-5257-457d-ef33-00000000005b 32980 1727096604.19233: WORKER PROCESS EXITING 32980 1727096604.19260: no more pending results, returning what we have 32980 1727096604.19265: in VariableManager get_vars() 32980 1727096604.19317: Calling all_inventory to load vars for managed_node2 32980 1727096604.19321: Calling groups_inventory to load vars for managed_node2 32980 1727096604.19323: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.19338: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.19342: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.19345: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.20858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.22337: done with get_vars() 32980 1727096604.22356: variable 'ansible_search_path' from source: unknown 32980 1727096604.22372: we have included files to process 32980 1727096604.22373: generating all_blocks data 32980 1727096604.22375: done generating all_blocks data 32980 1727096604.22381: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096604.22382: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096604.22386: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32980 1727096604.22493: in VariableManager get_vars() 32980 1727096604.22516: done with get_vars() 32980 1727096604.22621: done processing included file 32980 1727096604.22623: iterating over new_blocks loaded from include file 32980 1727096604.22625: in VariableManager get_vars() 32980 1727096604.22641: done with get_vars() 32980 1727096604.22642: filtering new block on tags 32980 1727096604.22660: done filtering new block on tags 32980 1727096604.22662: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 32980 1727096604.22666: extending task lists for 
all hosts with included blocks 32980 1727096604.25016: done extending task lists 32980 1727096604.25017: done processing included files 32980 1727096604.25018: results queue empty 32980 1727096604.25019: checking for any_errors_fatal 32980 1727096604.25020: done checking for any_errors_fatal 32980 1727096604.25021: checking for max_fail_percentage 32980 1727096604.25022: done checking for max_fail_percentage 32980 1727096604.25022: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.25023: done checking to see if all hosts have failed 32980 1727096604.25024: getting the remaining hosts for this loop 32980 1727096604.25025: done getting the remaining hosts for this loop 32980 1727096604.25027: getting the next task for host managed_node2 32980 1727096604.25031: done getting next task for host managed_node2 32980 1727096604.25032: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32980 1727096604.25035: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096604.25037: getting variables 32980 1727096604.25038: in VariableManager get_vars() 32980 1727096604.25051: Calling all_inventory to load vars for managed_node2 32980 1727096604.25053: Calling groups_inventory to load vars for managed_node2 32980 1727096604.25054: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.25059: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.25061: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.25064: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.26220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.27693: done with get_vars() 32980 1727096604.27711: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:03:24 -0400 (0:00:00.101) 0:00:16.204 ****** 32980 1727096604.27779: entering _queue_task() for managed_node2/include_tasks 32980 1727096604.28197: worker is 1 (out of 1 available) 32980 1727096604.28207: exiting _queue_task() for managed_node2/include_tasks 32980 1727096604.28218: done queuing things up, now waiting for results queue to drain 32980 1727096604.28219: waiting for pending results... 
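Per the task paths logged here, assert_device_present.yml opens with a nested include of get_interface_stat.yml at line 3 (an assert follows at line 5, sketched further below). A sketch of that first task, with any wording beyond the logged task name treated as an assumption:

    # tasks/assert_device_present.yml, line 3 (sketch)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml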
32980 1727096604.28402: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 32980 1727096604.28553: in run() - task 0afff68d-5257-457d-ef33-000000000578 32980 1727096604.28557: variable 'ansible_search_path' from source: unknown 32980 1727096604.28560: variable 'ansible_search_path' from source: unknown 32980 1727096604.28577: calling self._execute() 32980 1727096604.28669: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.28680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.28768: variable 'omit' from source: magic vars 32980 1727096604.29063: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.29082: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.29092: _execute() done 32980 1727096604.29104: dumping result to json 32980 1727096604.29111: done dumping result, returning 32980 1727096604.29120: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-457d-ef33-000000000578] 32980 1727096604.29128: sending task result for task 0afff68d-5257-457d-ef33-000000000578 32980 1727096604.29394: done sending task result for task 0afff68d-5257-457d-ef33-000000000578 32980 1727096604.29397: WORKER PROCESS EXITING 32980 1727096604.29422: no more pending results, returning what we have 32980 1727096604.29425: in VariableManager get_vars() 32980 1727096604.29472: Calling all_inventory to load vars for managed_node2 32980 1727096604.29474: Calling groups_inventory to load vars for managed_node2 32980 1727096604.29477: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.29489: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.29492: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.29494: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.30801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.32575: done with get_vars() 32980 1727096604.32607: variable 'ansible_search_path' from source: unknown 32980 1727096604.32608: variable 'ansible_search_path' from source: unknown 32980 1727096604.32651: we have included files to process 32980 1727096604.32653: generating all_blocks data 32980 1727096604.32654: done generating all_blocks data 32980 1727096604.32655: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096604.32656: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096604.32682: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32980 1727096604.32930: done processing included file 32980 1727096604.32932: iterating over new_blocks loaded from include file 32980 1727096604.32934: in VariableManager get_vars() 32980 1727096604.32954: done with get_vars() 32980 1727096604.32956: filtering new block on tags 32980 1727096604.32976: done filtering new block on tags 32980 1727096604.32979: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 32980 
1727096604.32984: extending task lists for all hosts with included blocks 32980 1727096604.33139: done extending task lists 32980 1727096604.33141: done processing included files 32980 1727096604.33142: results queue empty 32980 1727096604.33142: checking for any_errors_fatal 32980 1727096604.33146: done checking for any_errors_fatal 32980 1727096604.33146: checking for max_fail_percentage 32980 1727096604.33148: done checking for max_fail_percentage 32980 1727096604.33148: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.33149: done checking to see if all hosts have failed 32980 1727096604.33150: getting the remaining hosts for this loop 32980 1727096604.33151: done getting the remaining hosts for this loop 32980 1727096604.33154: getting the next task for host managed_node2 32980 1727096604.33158: done getting next task for host managed_node2 32980 1727096604.33160: ^ task is: TASK: Get stat for interface {{ interface }} 32980 1727096604.33163: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096604.33165: getting variables 32980 1727096604.33166: in VariableManager get_vars() 32980 1727096604.33184: Calling all_inventory to load vars for managed_node2 32980 1727096604.33186: Calling groups_inventory to load vars for managed_node2 32980 1727096604.33188: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.33193: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.33195: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.33198: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.37864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.38740: done with get_vars() 32980 1727096604.38759: done getting variables 32980 1727096604.38914: variable 'interface' from source: include params 32980 1727096604.38918: variable 'vlan_interface' from source: play vars 32980 1727096604.38984: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:03:24 -0400 (0:00:00.112) 0:00:16.316 ****** 32980 1727096604.39012: entering _queue_task() for managed_node2/stat 32980 1727096604.39379: worker is 1 (out of 1 available) 32980 1727096604.39392: exiting _queue_task() for managed_node2/stat 32980 1727096604.39405: done queuing things up, now waiting for results queue to drain 32980 1727096604.39407: waiting for pending results... 
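The "Get stat for interface lsr101.90" task resolves {{ interface }} from the include params (here the play's vlan_interface, lsr101.90) and, per the module invocation logged below, runs the stat module against /sys/class/net/<interface>. A sketch of get_interface_stat.yml; the module arguments are taken from the logged invocation, while the exact YAML and the register name (chosen to match the interface_stat variable used later) are assumptions:

    # tasks/get_interface_stat.yml, line 3 (sketch)
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat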
32980 1727096604.39705: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr101.90 32980 1727096604.39850: in run() - task 0afff68d-5257-457d-ef33-00000000069c 32980 1727096604.39875: variable 'ansible_search_path' from source: unknown 32980 1727096604.39886: variable 'ansible_search_path' from source: unknown 32980 1727096604.39925: calling self._execute() 32980 1727096604.40049: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.40071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.40120: variable 'omit' from source: magic vars 32980 1727096604.40435: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.40445: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.40451: variable 'omit' from source: magic vars 32980 1727096604.40490: variable 'omit' from source: magic vars 32980 1727096604.40561: variable 'interface' from source: include params 32980 1727096604.40565: variable 'vlan_interface' from source: play vars 32980 1727096604.40618: variable 'vlan_interface' from source: play vars 32980 1727096604.40629: variable 'omit' from source: magic vars 32980 1727096604.40664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096604.40694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096604.40710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096604.40725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.40735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.40757: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096604.40760: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.40762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.40836: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096604.40841: Set connection var ansible_timeout to 10 32980 1727096604.40844: Set connection var ansible_shell_type to sh 32980 1727096604.40846: Set connection var ansible_connection to ssh 32980 1727096604.40852: Set connection var ansible_shell_executable to /bin/sh 32980 1727096604.40857: Set connection var ansible_pipelining to False 32980 1727096604.40880: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.40883: variable 'ansible_connection' from source: unknown 32980 1727096604.40886: variable 'ansible_module_compression' from source: unknown 32980 1727096604.40888: variable 'ansible_shell_type' from source: unknown 32980 1727096604.40891: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.40893: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.40895: variable 'ansible_pipelining' from source: unknown 32980 1727096604.40898: variable 'ansible_timeout' from source: unknown 32980 1727096604.40900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.41049: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096604.41059: variable 'omit' from source: magic vars 32980 1727096604.41062: starting attempt loop 32980 1727096604.41065: running the handler 32980 1727096604.41080: _low_level_execute_command(): starting 32980 1727096604.41088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096604.41566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.41605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.41611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096604.41614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.41658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.41661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.41663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.41717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.43390: stdout chunk (state=3): >>>/root <<< 32980 1727096604.43492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.43516: stderr chunk (state=3): >>><<< 32980 1727096604.43520: stdout chunk (state=3): >>><<< 32980 1727096604.43542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 
1727096604.43553: _low_level_execute_command(): starting 32980 1727096604.43559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549 `" && echo ansible-tmp-1727096604.4354103-33792-221977251343549="` echo /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549 `" ) && sleep 0' 32980 1727096604.44012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.44016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096604.44027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.44029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096604.44032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096604.44034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.44070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.44074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.44080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.44114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.46037: stdout chunk (state=3): >>>ansible-tmp-1727096604.4354103-33792-221977251343549=/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549 <<< 32980 1727096604.46159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.46163: stdout chunk (state=3): >>><<< 32980 1727096604.46173: stderr chunk (state=3): >>><<< 32980 1727096604.46193: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096604.4354103-33792-221977251343549=/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.46229: variable 'ansible_module_compression' from source: unknown 32980 1727096604.46292: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32980 1727096604.46314: variable 'ansible_facts' from source: unknown 32980 1727096604.46375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py 32980 1727096604.46476: Sending initial data 32980 1727096604.46480: Sent initial data (153 bytes) 32980 1727096604.46921: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096604.46924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096604.46927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.46929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096604.46931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096604.46933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.46989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.46994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.46996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.47028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.48613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096604.48640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096604.48676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp3u0ps_g9 /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py <<< 32980 1727096604.48680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py" <<< 32980 1727096604.48710: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp3u0ps_g9" to remote "/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py" <<< 32980 1727096604.48715: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py" <<< 32980 1727096604.49203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.49242: stderr chunk (state=3): >>><<< 32980 1727096604.49246: stdout chunk (state=3): >>><<< 32980 1727096604.49271: done transferring module to remote 32980 1727096604.49281: _low_level_execute_command(): starting 32980 1727096604.49286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/ /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py && sleep 0' 32980 1727096604.49731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096604.49734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096604.49742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.49745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.49748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.49797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.49804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.49807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.49838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.51666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.51675: stdout chunk (state=3): >>><<< 32980 1727096604.51678: stderr chunk (state=3): >>><<< 32980 1727096604.51698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.51701: _low_level_execute_command(): starting 32980 1727096604.51704: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/AnsiballZ_stat.py && sleep 0' 32980 1727096604.52132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.52137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096604.52165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.52171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.52173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.52227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.52234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.52236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.52278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.67498: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31831, "dev": 23, "nlink": 1, "atime": 1727096603.1763515, "mtime": 1727096603.1763515, "ctime": 1727096603.1763515, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, 
"executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32980 1727096604.68811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096604.68838: stderr chunk (state=3): >>><<< 32980 1727096604.68842: stdout chunk (state=3): >>><<< 32980 1727096604.68857: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31831, "dev": 23, "nlink": 1, "atime": 1727096603.1763515, "mtime": 1727096603.1763515, "ctime": 1727096603.1763515, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096604.68897: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096604.68905: _low_level_execute_command(): starting 32980 1727096604.68910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096604.4354103-33792-221977251343549/ > /dev/null 2>&1 && sleep 0' 32980 1727096604.69346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.69361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096604.69365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.69388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.69437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.69441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.69443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.69479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.71273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.71299: stderr chunk (state=3): >>><<< 32980 1727096604.71302: stdout chunk (state=3): >>><<< 32980 1727096604.71318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.71323: handler run complete 32980 1727096604.71351: attempt loop complete, returning result 32980 1727096604.71354: _execute() done 32980 1727096604.71357: dumping result to json 32980 1727096604.71362: done dumping result, returning 32980 1727096604.71370: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr101.90 [0afff68d-5257-457d-ef33-00000000069c] 32980 1727096604.71375: sending task result for task 0afff68d-5257-457d-ef33-00000000069c 32980 1727096604.71477: done sending task result for task 0afff68d-5257-457d-ef33-00000000069c 32980 1727096604.71479: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096603.1763515, "block_size": 4096, "blocks": 0, "ctime": 1727096603.1763515, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31831, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1727096603.1763515, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 32980 1727096604.71557: no more pending results, returning what we have 32980 1727096604.71560: results queue empty 32980 1727096604.71561: checking for any_errors_fatal 32980 1727096604.71562: done checking for any_errors_fatal 32980 1727096604.71563: checking for max_fail_percentage 32980 1727096604.71564: done checking for max_fail_percentage 32980 1727096604.71565: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.71566: done checking to see if all hosts have failed 32980 1727096604.71566: getting the remaining hosts for this loop 32980 1727096604.71570: done getting the remaining hosts for this loop 32980 1727096604.71576: getting the next task for host managed_node2 32980 1727096604.71585: done getting next task for host managed_node2 32980 1727096604.71587: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 32980 1727096604.71589: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096604.71595: getting variables 32980 1727096604.71596: in VariableManager get_vars() 32980 1727096604.71634: Calling all_inventory to load vars for managed_node2 32980 1727096604.71637: Calling groups_inventory to load vars for managed_node2 32980 1727096604.71639: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.71649: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.71651: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.71653: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.72457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.73324: done with get_vars() 32980 1727096604.73339: done getting variables 32980 1727096604.73385: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096604.73476: variable 'interface' from source: include params 32980 1727096604.73479: variable 'vlan_interface' from source: play vars 32980 1727096604.73522: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:03:24 -0400 (0:00:00.345) 0:00:16.662 ****** 32980 1727096604.73547: entering _queue_task() for managed_node2/assert 32980 1727096604.73763: worker is 1 (out of 1 available) 32980 1727096604.73779: exiting _queue_task() for managed_node2/assert 32980 1727096604.73791: done queuing things up, now waiting for results queue to drain 32980 1727096604.73792: waiting for pending results... 
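The assertion at assert_device_present.yml:5 then checks the stat result registered above, which shows /sys/class/net/lsr101.90 exists as a symlink into /sys/devices/virtual/net, as expected for a VLAN device. The only condition the log shows being evaluated is interface_stat.stat.exists; a sketch (any failure message would be an assumption):

    # tasks/assert_device_present.yml, line 5 (sketch)
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists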
32980 1727096604.73959: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr101.90' 32980 1727096604.74027: in run() - task 0afff68d-5257-457d-ef33-000000000579 32980 1727096604.74037: variable 'ansible_search_path' from source: unknown 32980 1727096604.74040: variable 'ansible_search_path' from source: unknown 32980 1727096604.74066: calling self._execute() 32980 1727096604.74138: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.74142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.74151: variable 'omit' from source: magic vars 32980 1727096604.74414: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.74424: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.74429: variable 'omit' from source: magic vars 32980 1727096604.74460: variable 'omit' from source: magic vars 32980 1727096604.74525: variable 'interface' from source: include params 32980 1727096604.74529: variable 'vlan_interface' from source: play vars 32980 1727096604.74577: variable 'vlan_interface' from source: play vars 32980 1727096604.74589: variable 'omit' from source: magic vars 32980 1727096604.74622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096604.74647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096604.74664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096604.74683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.74692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.74714: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096604.74717: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.74720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.74791: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096604.74794: Set connection var ansible_timeout to 10 32980 1727096604.74797: Set connection var ansible_shell_type to sh 32980 1727096604.74799: Set connection var ansible_connection to ssh 32980 1727096604.74806: Set connection var ansible_shell_executable to /bin/sh 32980 1727096604.74809: Set connection var ansible_pipelining to False 32980 1727096604.74826: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.74829: variable 'ansible_connection' from source: unknown 32980 1727096604.74831: variable 'ansible_module_compression' from source: unknown 32980 1727096604.74834: variable 'ansible_shell_type' from source: unknown 32980 1727096604.74836: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.74838: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.74841: variable 'ansible_pipelining' from source: unknown 32980 1727096604.74843: variable 'ansible_timeout' from source: unknown 32980 1727096604.74847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.74951: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096604.74959: variable 'omit' from source: magic vars 32980 1727096604.74964: starting attempt loop 32980 1727096604.74967: running the handler 32980 1727096604.75054: variable 'interface_stat' from source: set_fact 32980 1727096604.75069: Evaluated conditional (interface_stat.stat.exists): True 32980 1727096604.75077: handler run complete 32980 1727096604.75086: attempt loop complete, returning result 32980 1727096604.75089: _execute() done 32980 1727096604.75091: dumping result to json 32980 1727096604.75094: done dumping result, returning 32980 1727096604.75101: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr101.90' [0afff68d-5257-457d-ef33-000000000579] 32980 1727096604.75105: sending task result for task 0afff68d-5257-457d-ef33-000000000579 32980 1727096604.75186: done sending task result for task 0afff68d-5257-457d-ef33-000000000579 32980 1727096604.75188: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096604.75260: no more pending results, returning what we have 32980 1727096604.75263: results queue empty 32980 1727096604.75264: checking for any_errors_fatal 32980 1727096604.75272: done checking for any_errors_fatal 32980 1727096604.75275: checking for max_fail_percentage 32980 1727096604.75277: done checking for max_fail_percentage 32980 1727096604.75277: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.75278: done checking to see if all hosts have failed 32980 1727096604.75279: getting the remaining hosts for this loop 32980 1727096604.75280: done getting the remaining hosts for this loop 32980 1727096604.75285: getting the next task for host managed_node2 32980 1727096604.75292: done getting next task for host managed_node2 32980 1727096604.75294: ^ task is: TASK: Include the task 'assert_profile_present.yml' 32980 1727096604.75296: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096604.75299: getting variables 32980 1727096604.75300: in VariableManager get_vars() 32980 1727096604.75335: Calling all_inventory to load vars for managed_node2 32980 1727096604.75337: Calling groups_inventory to load vars for managed_node2 32980 1727096604.75339: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.75348: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.75350: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.75352: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.76154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.77011: done with get_vars() 32980 1727096604.77025: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Monday 23 September 2024 09:03:24 -0400 (0:00:00.035) 0:00:16.697 ****** 32980 1727096604.77089: entering _queue_task() for managed_node2/include_tasks 32980 1727096604.77289: worker is 1 (out of 1 available) 32980 1727096604.77302: exiting _queue_task() for managed_node2/include_tasks 32980 1727096604.77314: done queuing things up, now waiting for results queue to drain 32980 1727096604.77315: waiting for pending results... 32980 1727096604.77477: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 32980 1727096604.77536: in run() - task 0afff68d-5257-457d-ef33-00000000005c 32980 1727096604.77551: variable 'ansible_search_path' from source: unknown 32980 1727096604.77586: variable 'interface' from source: play vars 32980 1727096604.77724: variable 'interface' from source: play vars 32980 1727096604.77736: variable 'vlan_interface' from source: play vars 32980 1727096604.77786: variable 'vlan_interface' from source: play vars 32980 1727096604.77797: variable 'omit' from source: magic vars 32980 1727096604.77891: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.77898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.77908: variable 'omit' from source: magic vars 32980 1727096604.78071: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.78079: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.78102: variable 'item' from source: unknown 32980 1727096604.78146: variable 'item' from source: unknown 32980 1727096604.78271: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.78277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.78280: variable 'omit' from source: magic vars 32980 1727096604.78346: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.78349: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.78369: variable 'item' from source: unknown 32980 1727096604.78414: variable 'item' from source: unknown 32980 1727096604.78482: dumping result to json 32980 1727096604.78484: done dumping result, returning 32980 1727096604.78487: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-457d-ef33-00000000005c] 32980 1727096604.78489: sending task result for task 0afff68d-5257-457d-ef33-00000000005c 32980 
1727096604.78525: done sending task result for task 0afff68d-5257-457d-ef33-00000000005c 32980 1727096604.78527: WORKER PROCESS EXITING 32980 1727096604.78550: no more pending results, returning what we have 32980 1727096604.78554: in VariableManager get_vars() 32980 1727096604.78600: Calling all_inventory to load vars for managed_node2 32980 1727096604.78603: Calling groups_inventory to load vars for managed_node2 32980 1727096604.78605: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.78615: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.78617: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.78620: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.79370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.80321: done with get_vars() 32980 1727096604.80334: variable 'ansible_search_path' from source: unknown 32980 1727096604.80345: variable 'ansible_search_path' from source: unknown 32980 1727096604.80352: we have included files to process 32980 1727096604.80353: generating all_blocks data 32980 1727096604.80354: done generating all_blocks data 32980 1727096604.80357: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80358: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80359: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80486: in VariableManager get_vars() 32980 1727096604.80500: done with get_vars() 32980 1727096604.80661: done processing included file 32980 1727096604.80662: iterating over new_blocks loaded from include file 32980 1727096604.80663: in VariableManager get_vars() 32980 1727096604.80680: done with get_vars() 32980 1727096604.80681: filtering new block on tags 32980 1727096604.80694: done filtering new block on tags 32980 1727096604.80695: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=lsr101) 32980 1727096604.80698: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80699: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80701: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32980 1727096604.80756: in VariableManager get_vars() 32980 1727096604.80772: done with get_vars() 32980 1727096604.80923: done processing included file 32980 1727096604.80924: iterating over new_blocks loaded from include file 32980 1727096604.80925: in VariableManager get_vars() 32980 1727096604.80936: done with get_vars() 32980 1727096604.80937: filtering new block on tags 32980 1727096604.80948: done filtering new block on tags 32980 1727096604.80949: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=lsr101.90) 32980 1727096604.80951: extending task lists for all hosts with included blocks 32980 1727096604.82375: done extending task lists 32980 1727096604.82377: done processing included files 32980 1727096604.82377: results queue empty 32980 1727096604.82378: checking for any_errors_fatal 32980 1727096604.82380: done checking for any_errors_fatal 32980 1727096604.82380: checking for max_fail_percentage 32980 1727096604.82381: done checking for max_fail_percentage 32980 1727096604.82381: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.82382: done checking to see if all hosts have failed 32980 1727096604.82382: getting the remaining hosts for this loop 32980 1727096604.82383: done getting the remaining hosts for this loop 32980 1727096604.82384: getting the next task for host managed_node2 32980 1727096604.82387: done getting next task for host managed_node2 32980 1727096604.82388: ^ task is: TASK: Include the task 'get_profile_stat.yml' 32980 1727096604.82390: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096604.82391: getting variables 32980 1727096604.82392: in VariableManager get_vars() 32980 1727096604.82400: Calling all_inventory to load vars for managed_node2 32980 1727096604.82401: Calling groups_inventory to load vars for managed_node2 32980 1727096604.82403: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.82406: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.82407: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.82415: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.83018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.83865: done with get_vars() 32980 1727096604.83881: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 09:03:24 -0400 (0:00:00.068) 0:00:16.765 ****** 32980 1727096604.83930: entering _queue_task() for managed_node2/include_tasks 32980 1727096604.84137: worker is 1 (out of 1 available) 32980 1727096604.84149: exiting _queue_task() for managed_node2/include_tasks 32980 1727096604.84161: done queuing things up, now waiting for results queue to drain 32980 1727096604.84163: waiting for pending results... 
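The two include results above show assert_profile_present.yml being pulled in once per profile (item=lsr101 and item=lsr101.90), with the distribution-version conditional evaluated for each item and 'profile' passed down as an include parameter. A minimal sketch of what the task at tests_vlan_mtu.yml:50 likely looks like, reconstructed from those items, the play vars, and the conditional seen in the log; the relative include path and exact wording are assumptions, not a quote of the file:

    - name: Include the task 'assert_profile_present.yml'
      include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"            # seen as 'profile' from include params later in the log
      loop:
        - "{{ interface }}"              # resolves to lsr101 in this run
        - "{{ vlan_interface }}"         # resolves to lsr101.90 in this run
      when: ansible_distribution_major_version != '6'
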
32980 1727096604.84335: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 32980 1727096604.84399: in run() - task 0afff68d-5257-457d-ef33-0000000006b8 32980 1727096604.84409: variable 'ansible_search_path' from source: unknown 32980 1727096604.84412: variable 'ansible_search_path' from source: unknown 32980 1727096604.84440: calling self._execute() 32980 1727096604.84519: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.84522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.84532: variable 'omit' from source: magic vars 32980 1727096604.84801: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.84810: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.84816: _execute() done 32980 1727096604.84821: dumping result to json 32980 1727096604.84824: done dumping result, returning 32980 1727096604.84827: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-457d-ef33-0000000006b8] 32980 1727096604.84837: sending task result for task 0afff68d-5257-457d-ef33-0000000006b8 32980 1727096604.84911: done sending task result for task 0afff68d-5257-457d-ef33-0000000006b8 32980 1727096604.84914: WORKER PROCESS EXITING 32980 1727096604.84965: no more pending results, returning what we have 32980 1727096604.84971: in VariableManager get_vars() 32980 1727096604.85009: Calling all_inventory to load vars for managed_node2 32980 1727096604.85011: Calling groups_inventory to load vars for managed_node2 32980 1727096604.85013: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.85022: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.85025: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.85027: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.85842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.86696: done with get_vars() 32980 1727096604.86709: variable 'ansible_search_path' from source: unknown 32980 1727096604.86710: variable 'ansible_search_path' from source: unknown 32980 1727096604.86734: we have included files to process 32980 1727096604.86735: generating all_blocks data 32980 1727096604.86736: done generating all_blocks data 32980 1727096604.86737: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096604.86737: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096604.86739: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096604.87400: done processing included file 32980 1727096604.87401: iterating over new_blocks loaded from include file 32980 1727096604.87402: in VariableManager get_vars() 32980 1727096604.87414: done with get_vars() 32980 1727096604.87415: filtering new block on tags 32980 1727096604.87428: done filtering new block on tags 32980 1727096604.87430: in VariableManager get_vars() 32980 1727096604.87440: done with get_vars() 32980 1727096604.87441: filtering new block on tags 32980 1727096604.87454: done filtering new block on tags 32980 1727096604.87455: done iterating over 
new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 32980 1727096604.87458: extending task lists for all hosts with included blocks 32980 1727096604.87552: done extending task lists 32980 1727096604.87553: done processing included files 32980 1727096604.87553: results queue empty 32980 1727096604.87554: checking for any_errors_fatal 32980 1727096604.87556: done checking for any_errors_fatal 32980 1727096604.87556: checking for max_fail_percentage 32980 1727096604.87557: done checking for max_fail_percentage 32980 1727096604.87557: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.87558: done checking to see if all hosts have failed 32980 1727096604.87558: getting the remaining hosts for this loop 32980 1727096604.87559: done getting the remaining hosts for this loop 32980 1727096604.87561: getting the next task for host managed_node2 32980 1727096604.87563: done getting next task for host managed_node2 32980 1727096604.87564: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 32980 1727096604.87566: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096604.87570: getting variables 32980 1727096604.87571: in VariableManager get_vars() 32980 1727096604.87615: Calling all_inventory to load vars for managed_node2 32980 1727096604.87617: Calling groups_inventory to load vars for managed_node2 32980 1727096604.87618: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.87622: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.87623: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.87625: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.88226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.89112: done with get_vars() 32980 1727096604.89126: done getting variables 32980 1727096604.89151: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:03:24 -0400 (0:00:00.052) 0:00:16.818 ****** 32980 1727096604.89173: entering _queue_task() for managed_node2/set_fact 32980 1727096604.89399: worker is 1 (out of 1 available) 32980 1727096604.89412: exiting _queue_task() for managed_node2/set_fact 32980 1727096604.89425: done queuing things up, now waiting for results queue to drain 32980 1727096604.89426: waiting for pending results... 
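The set_fact task queued here initializes the three flags that later tasks in get_profile_stat.yml flip once the profile is found. A sketch consistent with the task name at get_profile_stat.yml:3 and with the ansible_facts reported in the result that follows; the file contents are inferred from the log, not quoted:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false            # matches the result printed below
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
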
32980 1727096604.89597: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 32980 1727096604.89669: in run() - task 0afff68d-5257-457d-ef33-0000000007f0 32980 1727096604.89683: variable 'ansible_search_path' from source: unknown 32980 1727096604.89686: variable 'ansible_search_path' from source: unknown 32980 1727096604.89713: calling self._execute() 32980 1727096604.89784: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.89788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.89796: variable 'omit' from source: magic vars 32980 1727096604.90064: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.90075: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.90083: variable 'omit' from source: magic vars 32980 1727096604.90115: variable 'omit' from source: magic vars 32980 1727096604.90139: variable 'omit' from source: magic vars 32980 1727096604.90173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096604.90206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096604.90219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096604.90233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.90243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.90268: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096604.90271: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.90274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.90348: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096604.90351: Set connection var ansible_timeout to 10 32980 1727096604.90354: Set connection var ansible_shell_type to sh 32980 1727096604.90356: Set connection var ansible_connection to ssh 32980 1727096604.90363: Set connection var ansible_shell_executable to /bin/sh 32980 1727096604.90369: Set connection var ansible_pipelining to False 32980 1727096604.90388: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.90391: variable 'ansible_connection' from source: unknown 32980 1727096604.90394: variable 'ansible_module_compression' from source: unknown 32980 1727096604.90397: variable 'ansible_shell_type' from source: unknown 32980 1727096604.90399: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.90402: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.90404: variable 'ansible_pipelining' from source: unknown 32980 1727096604.90406: variable 'ansible_timeout' from source: unknown 32980 1727096604.90409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.90509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096604.90519: variable 
'omit' from source: magic vars 32980 1727096604.90526: starting attempt loop 32980 1727096604.90529: running the handler 32980 1727096604.90541: handler run complete 32980 1727096604.90549: attempt loop complete, returning result 32980 1727096604.90551: _execute() done 32980 1727096604.90554: dumping result to json 32980 1727096604.90556: done dumping result, returning 32980 1727096604.90562: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-457d-ef33-0000000007f0] 32980 1727096604.90566: sending task result for task 0afff68d-5257-457d-ef33-0000000007f0 32980 1727096604.90644: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f0 32980 1727096604.90647: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 32980 1727096604.90702: no more pending results, returning what we have 32980 1727096604.90705: results queue empty 32980 1727096604.90706: checking for any_errors_fatal 32980 1727096604.90707: done checking for any_errors_fatal 32980 1727096604.90708: checking for max_fail_percentage 32980 1727096604.90709: done checking for max_fail_percentage 32980 1727096604.90710: checking to see if all hosts have failed and the running result is not ok 32980 1727096604.90711: done checking to see if all hosts have failed 32980 1727096604.90711: getting the remaining hosts for this loop 32980 1727096604.90713: done getting the remaining hosts for this loop 32980 1727096604.90716: getting the next task for host managed_node2 32980 1727096604.90724: done getting next task for host managed_node2 32980 1727096604.90726: ^ task is: TASK: Stat profile file 32980 1727096604.90729: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096604.90732: getting variables 32980 1727096604.90734: in VariableManager get_vars() 32980 1727096604.90771: Calling all_inventory to load vars for managed_node2 32980 1727096604.90773: Calling groups_inventory to load vars for managed_node2 32980 1727096604.90776: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096604.90785: Calling all_plugins_play to load vars for managed_node2 32980 1727096604.90787: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096604.90789: Calling groups_plugins_play to load vars for managed_node2 32980 1727096604.91530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096604.92386: done with get_vars() 32980 1727096604.92400: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:03:24 -0400 (0:00:00.032) 0:00:16.851 ****** 32980 1727096604.92458: entering _queue_task() for managed_node2/stat 32980 1727096604.92669: worker is 1 (out of 1 available) 32980 1727096604.92683: exiting _queue_task() for managed_node2/stat 32980 1727096604.92695: done queuing things up, now waiting for results queue to drain 32980 1727096604.92696: waiting for pending results... 32980 1727096604.92858: running TaskExecutor() for managed_node2/TASK: Stat profile file 32980 1727096604.92925: in run() - task 0afff68d-5257-457d-ef33-0000000007f1 32980 1727096604.92943: variable 'ansible_search_path' from source: unknown 32980 1727096604.92948: variable 'ansible_search_path' from source: unknown 32980 1727096604.92979: calling self._execute() 32980 1727096604.93053: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.93058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.93066: variable 'omit' from source: magic vars 32980 1727096604.93338: variable 'ansible_distribution_major_version' from source: facts 32980 1727096604.93348: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096604.93356: variable 'omit' from source: magic vars 32980 1727096604.93391: variable 'omit' from source: magic vars 32980 1727096604.93461: variable 'profile' from source: include params 32980 1727096604.93466: variable 'item' from source: include params 32980 1727096604.93517: variable 'item' from source: include params 32980 1727096604.93531: variable 'omit' from source: magic vars 32980 1727096604.93565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096604.93598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096604.93615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096604.93628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.93638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096604.93663: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096604.93666: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.93671: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.93742: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096604.93746: Set connection var ansible_timeout to 10 32980 1727096604.93748: Set connection var ansible_shell_type to sh 32980 1727096604.93751: Set connection var ansible_connection to ssh 32980 1727096604.93758: Set connection var ansible_shell_executable to /bin/sh 32980 1727096604.93762: Set connection var ansible_pipelining to False 32980 1727096604.93781: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.93785: variable 'ansible_connection' from source: unknown 32980 1727096604.93787: variable 'ansible_module_compression' from source: unknown 32980 1727096604.93790: variable 'ansible_shell_type' from source: unknown 32980 1727096604.93793: variable 'ansible_shell_executable' from source: unknown 32980 1727096604.93796: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096604.93799: variable 'ansible_pipelining' from source: unknown 32980 1727096604.93802: variable 'ansible_timeout' from source: unknown 32980 1727096604.93804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096604.93950: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096604.93957: variable 'omit' from source: magic vars 32980 1727096604.93963: starting attempt loop 32980 1727096604.93965: running the handler 32980 1727096604.93980: _low_level_execute_command(): starting 32980 1727096604.93988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096604.94506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096604.94510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096604.94514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096604.94517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.94565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.94572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.94574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.94622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.96307: stdout chunk (state=3): >>>/root <<< 32980 1727096604.96403: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 32980 1727096604.96437: stderr chunk (state=3): >>><<< 32980 1727096604.96441: stdout chunk (state=3): >>><<< 32980 1727096604.96460: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.96473: _low_level_execute_command(): starting 32980 1727096604.96482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399 `" && echo ansible-tmp-1727096604.964604-33807-279943465038399="` echo /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399 `" ) && sleep 0' 32980 1727096604.96929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096604.96933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096604.96943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.96946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.96948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.96986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.97003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.97036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096604.98945: stdout chunk (state=3): 
>>>ansible-tmp-1727096604.964604-33807-279943465038399=/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399 <<< 32980 1727096604.99048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096604.99079: stderr chunk (state=3): >>><<< 32980 1727096604.99083: stdout chunk (state=3): >>><<< 32980 1727096604.99099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096604.964604-33807-279943465038399=/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096604.99140: variable 'ansible_module_compression' from source: unknown 32980 1727096604.99186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32980 1727096604.99220: variable 'ansible_facts' from source: unknown 32980 1727096604.99281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py 32980 1727096604.99383: Sending initial data 32980 1727096604.99386: Sent initial data (152 bytes) 32980 1727096604.99835: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.99838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096604.99841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.99843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096604.99845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096604.99902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' <<< 32980 1727096604.99909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096604.99911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096604.99942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.01527: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096605.01556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096605.01594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpof1sz5o0 /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py <<< 32980 1727096605.01597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py" <<< 32980 1727096605.01623: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpof1sz5o0" to remote "/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py" <<< 32980 1727096605.01630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py" <<< 32980 1727096605.02104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.02152: stderr chunk (state=3): >>><<< 32980 1727096605.02155: stdout chunk (state=3): >>><<< 32980 1727096605.02198: done transferring module to remote 32980 1727096605.02207: _low_level_execute_command(): starting 32980 1727096605.02212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/ /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py && sleep 0' 32980 1727096605.02666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.02672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.02675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.02677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 32980 1727096605.02683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.02732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.02735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.02741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.02775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.04625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.04653: stderr chunk (state=3): >>><<< 32980 1727096605.04657: stdout chunk (state=3): >>><<< 32980 1727096605.04675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.04680: _low_level_execute_command(): starting 32980 1727096605.04685: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/AnsiballZ_stat.py && sleep 0' 32980 1727096605.05134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096605.05138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.05140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096605.05142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.05145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 
originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.05203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.05208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.05245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.20803: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32980 1727096605.22170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096605.22198: stderr chunk (state=3): >>><<< 32980 1727096605.22202: stdout chunk (state=3): >>><<< 32980 1727096605.22217: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
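The module arguments echoed in the stdout above show how the 'Stat profile file' task is parameterized: it checks for an initscripts ifcfg file named after the current profile and skips attribute, checksum, and MIME collection. A sketch of the task at get_profile_stat.yml:9, reconstructed from those arguments and from the profile_stat conditional evaluated further down; the templated path and register name are inferred:

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # lsr101 for this include
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat                                        # consumed by the next task's when:
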
32980 1727096605.22241: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096605.22251: _low_level_execute_command(): starting 32980 1727096605.22256: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096604.964604-33807-279943465038399/ > /dev/null 2>&1 && sleep 0' 32980 1727096605.22704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.22708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.22728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.22773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.22778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.22793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.22829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.24689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.24715: stderr chunk (state=3): >>><<< 32980 1727096605.24718: stdout chunk (state=3): >>><<< 32980 1727096605.24732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.24738: handler run complete 32980 1727096605.24760: attempt loop complete, returning result 32980 1727096605.24763: _execute() done 32980 1727096605.24766: dumping result to json 32980 1727096605.24769: done dumping result, returning 32980 1727096605.24778: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-457d-ef33-0000000007f1] 32980 1727096605.24782: sending task result for task 0afff68d-5257-457d-ef33-0000000007f1 32980 1727096605.24872: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f1 32980 1727096605.24877: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 32980 1727096605.24928: no more pending results, returning what we have 32980 1727096605.24931: results queue empty 32980 1727096605.24932: checking for any_errors_fatal 32980 1727096605.24941: done checking for any_errors_fatal 32980 1727096605.24942: checking for max_fail_percentage 32980 1727096605.24944: done checking for max_fail_percentage 32980 1727096605.24945: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.24946: done checking to see if all hosts have failed 32980 1727096605.24946: getting the remaining hosts for this loop 32980 1727096605.24948: done getting the remaining hosts for this loop 32980 1727096605.24951: getting the next task for host managed_node2 32980 1727096605.24959: done getting next task for host managed_node2 32980 1727096605.24961: ^ task is: TASK: Set NM profile exist flag based on the profile files 32980 1727096605.24966: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.24975: getting variables 32980 1727096605.24977: in VariableManager get_vars() 32980 1727096605.25020: Calling all_inventory to load vars for managed_node2 32980 1727096605.25022: Calling groups_inventory to load vars for managed_node2 32980 1727096605.25025: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.25036: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.25038: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.25041: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.26021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.26881: done with get_vars() 32980 1727096605.26898: done getting variables 32980 1727096605.26944: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 09:03:25 -0400 (0:00:00.345) 0:00:17.196 ****** 32980 1727096605.26966: entering _queue_task() for managed_node2/set_fact 32980 1727096605.27223: worker is 1 (out of 1 available) 32980 1727096605.27236: exiting _queue_task() for managed_node2/set_fact 32980 1727096605.27248: done queuing things up, now waiting for results queue to drain 32980 1727096605.27250: waiting for pending results... 
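This set_fact task only runs when the ifcfg file located by the previous stat exists; in the result that follows it is skipped because profile_stat.stat.exists is false. A sketch consistent with the task name at get_profile_stat.yml:17 and the false_condition reported below; the fact name mirrors the initialization task above and is an assumption:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists     # false here, so the task is skipped
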
32980 1727096605.27422: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 32980 1727096605.27500: in run() - task 0afff68d-5257-457d-ef33-0000000007f2 32980 1727096605.27511: variable 'ansible_search_path' from source: unknown 32980 1727096605.27515: variable 'ansible_search_path' from source: unknown 32980 1727096605.27542: calling self._execute() 32980 1727096605.27619: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.27622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.27631: variable 'omit' from source: magic vars 32980 1727096605.27904: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.27917: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.27998: variable 'profile_stat' from source: set_fact 32980 1727096605.28008: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096605.28011: when evaluation is False, skipping this task 32980 1727096605.28014: _execute() done 32980 1727096605.28025: dumping result to json 32980 1727096605.28028: done dumping result, returning 32980 1727096605.28031: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-457d-ef33-0000000007f2] 32980 1727096605.28033: sending task result for task 0afff68d-5257-457d-ef33-0000000007f2 32980 1727096605.28111: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f2 32980 1727096605.28114: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096605.28189: no more pending results, returning what we have 32980 1727096605.28192: results queue empty 32980 1727096605.28194: checking for any_errors_fatal 32980 1727096605.28202: done checking for any_errors_fatal 32980 1727096605.28202: checking for max_fail_percentage 32980 1727096605.28204: done checking for max_fail_percentage 32980 1727096605.28205: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.28205: done checking to see if all hosts have failed 32980 1727096605.28206: getting the remaining hosts for this loop 32980 1727096605.28208: done getting the remaining hosts for this loop 32980 1727096605.28211: getting the next task for host managed_node2 32980 1727096605.28218: done getting next task for host managed_node2 32980 1727096605.28220: ^ task is: TASK: Get NM profile info 32980 1727096605.28224: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.28228: getting variables 32980 1727096605.28229: in VariableManager get_vars() 32980 1727096605.28272: Calling all_inventory to load vars for managed_node2 32980 1727096605.28277: Calling groups_inventory to load vars for managed_node2 32980 1727096605.28279: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.28288: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.28291: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.28293: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.29069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.29937: done with get_vars() 32980 1727096605.29953: done getting variables 32980 1727096605.30027: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 09:03:25 -0400 (0:00:00.030) 0:00:17.227 ****** 32980 1727096605.30048: entering _queue_task() for managed_node2/shell 32980 1727096605.30049: Creating lock for shell 32980 1727096605.30294: worker is 1 (out of 1 available) 32980 1727096605.30309: exiting _queue_task() for managed_node2/shell 32980 1727096605.30320: done queuing things up, now waiting for results queue to drain 32980 1727096605.30321: waiting for pending results... 
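The 'Get NM profile info' task at get_profile_stat.yml:25 uses the shell action to ask NetworkManager whether a connection profile for the current interface exists. The actual command line is not visible in this excerpt, so the nmcli invocation, register name, and error handling below are illustrative assumptions only:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }}   # assumed command; not shown in this log excerpt
      register: nm_profile_exists                                          # assumed register name
      ignore_errors: true                                                  # assumed, so a missing profile does not fail the play
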
32980 1727096605.30491: running TaskExecutor() for managed_node2/TASK: Get NM profile info 32980 1727096605.30569: in run() - task 0afff68d-5257-457d-ef33-0000000007f3 32980 1727096605.30581: variable 'ansible_search_path' from source: unknown 32980 1727096605.30585: variable 'ansible_search_path' from source: unknown 32980 1727096605.30612: calling self._execute() 32980 1727096605.30683: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.30688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.30695: variable 'omit' from source: magic vars 32980 1727096605.30964: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.30980: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.30984: variable 'omit' from source: magic vars 32980 1727096605.31012: variable 'omit' from source: magic vars 32980 1727096605.31080: variable 'profile' from source: include params 32980 1727096605.31084: variable 'item' from source: include params 32980 1727096605.31132: variable 'item' from source: include params 32980 1727096605.31146: variable 'omit' from source: magic vars 32980 1727096605.31183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096605.31213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096605.31227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096605.31240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.31249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.31276: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096605.31279: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.31281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.31350: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096605.31353: Set connection var ansible_timeout to 10 32980 1727096605.31356: Set connection var ansible_shell_type to sh 32980 1727096605.31358: Set connection var ansible_connection to ssh 32980 1727096605.31366: Set connection var ansible_shell_executable to /bin/sh 32980 1727096605.31371: Set connection var ansible_pipelining to False 32980 1727096605.31390: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.31392: variable 'ansible_connection' from source: unknown 32980 1727096605.31395: variable 'ansible_module_compression' from source: unknown 32980 1727096605.31397: variable 'ansible_shell_type' from source: unknown 32980 1727096605.31399: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.31402: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.31404: variable 'ansible_pipelining' from source: unknown 32980 1727096605.31407: variable 'ansible_timeout' from source: unknown 32980 1727096605.31411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.31512: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096605.31520: variable 'omit' from source: magic vars 32980 1727096605.31525: starting attempt loop 32980 1727096605.31530: running the handler 32980 1727096605.31540: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096605.31554: _low_level_execute_command(): starting 32980 1727096605.31561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096605.32050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.32094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.32097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.32100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096605.32102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096605.32105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.32148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.32152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.32154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.32224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.33916: stdout chunk (state=3): >>>/root <<< 32980 1727096605.34014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.34049: stderr chunk (state=3): >>><<< 32980 1727096605.34052: stdout chunk (state=3): >>><<< 32980 1727096605.34080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.34089: _low_level_execute_command(): starting 32980 1727096605.34096: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838 `" && echo ansible-tmp-1727096605.3407726-33823-161630227101838="` echo /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838 `" ) && sleep 0' 32980 1727096605.34624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096605.34627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096605.34630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.34633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096605.34635: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.34673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.34694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.34701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.34727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.36655: stdout chunk (state=3): >>>ansible-tmp-1727096605.3407726-33823-161630227101838=/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838 <<< 32980 1727096605.36765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.36794: stderr chunk (state=3): >>><<< 32980 1727096605.36798: stdout chunk (state=3): >>><<< 32980 1727096605.36813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096605.3407726-33823-161630227101838=/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.36840: variable 'ansible_module_compression' from source: unknown 32980 1727096605.36883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096605.36914: variable 'ansible_facts' from source: unknown 32980 1727096605.36966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py 32980 1727096605.37064: Sending initial data 32980 1727096605.37069: Sent initial data (156 bytes) 32980 1727096605.37530: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.37533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.37536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.37539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.37541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.37588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.37591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.37597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.37629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.39196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32980 1727096605.39199: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096605.39225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096605.39259: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpo9cjmc4b /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py <<< 32980 1727096605.39264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py" <<< 32980 1727096605.39295: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpo9cjmc4b" to remote "/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py" <<< 32980 1727096605.39787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.39830: stderr chunk (state=3): >>><<< 32980 1727096605.39834: stdout chunk (state=3): >>><<< 32980 1727096605.39875: done transferring module to remote 32980 1727096605.39886: _low_level_execute_command(): starting 32980 1727096605.39891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/ /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py && sleep 0' 32980 1727096605.40347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096605.40355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.40358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.40360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.40362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096605.40364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.40404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.40407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.40412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.40446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.42261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 
1727096605.42294: stderr chunk (state=3): >>><<< 32980 1727096605.42297: stdout chunk (state=3): >>><<< 32980 1727096605.42311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.42314: _low_level_execute_command(): starting 32980 1727096605.42319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/AnsiballZ_command.py && sleep 0' 32980 1727096605.42736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096605.42740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096605.42772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096605.42778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096605.42780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.42782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.42829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096605.42833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.42842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.42891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.60186: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": 
"nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-23 09:03:25.581770", "end": "2024-09-23 09:03:25.600573", "delta": "0:00:00.018803", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096605.61823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096605.61847: stderr chunk (state=3): >>><<< 32980 1727096605.61851: stdout chunk (state=3): >>><<< 32980 1727096605.61871: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-23 09:03:25.581770", "end": "2024-09-23 09:03:25.600573", "delta": "0:00:00.018803", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096605.61899: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096605.61908: _low_level_execute_command(): starting 32980 1727096605.61911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096605.3407726-33823-161630227101838/ > /dev/null 2>&1 && sleep 0' 32980 1727096605.62342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096605.62346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096605.62408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096605.62463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096605.62484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096605.62548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096605.64786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096605.64790: stdout chunk (state=3): >>><<< 32980 1727096605.64793: stderr chunk (state=3): >>><<< 32980 1727096605.64795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096605.64797: handler run complete 32980 1727096605.64799: Evaluated conditional (False): False 32980 1727096605.64801: attempt loop complete, returning result 32980 1727096605.64803: _execute() done 32980 1727096605.64805: dumping result to json 32980 1727096605.64807: done dumping result, returning 32980 1727096605.64809: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-457d-ef33-0000000007f3] 32980 1727096605.64816: sending task result for task 0afff68d-5257-457d-ef33-0000000007f3 ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.018803", "end": "2024-09-23 09:03:25.600573", "rc": 0, "start": "2024-09-23 09:03:25.581770" } STDOUT: lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 32980 1727096605.65075: no more pending results, returning what we have 32980 1727096605.65079: results queue empty 32980 1727096605.65080: checking for any_errors_fatal 32980 1727096605.65087: done checking for any_errors_fatal 32980 1727096605.65088: checking for max_fail_percentage 32980 1727096605.65090: done checking for max_fail_percentage 32980 1727096605.65091: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.65092: done checking to see if all hosts have failed 32980 1727096605.65093: getting the remaining hosts for this loop 32980 1727096605.65094: done getting the remaining hosts for this loop 32980 1727096605.65098: getting the next task for host managed_node2 32980 1727096605.65110: done getting next task for host managed_node2 32980 1727096605.65113: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32980 1727096605.65118: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.65122: getting variables 32980 1727096605.65124: in VariableManager get_vars() 32980 1727096605.65283: Calling all_inventory to load vars for managed_node2 32980 1727096605.65288: Calling groups_inventory to load vars for managed_node2 32980 1727096605.65291: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.65297: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f3 32980 1727096605.65300: WORKER PROCESS EXITING 32980 1727096605.65312: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.65315: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.65318: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.67090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.68695: done with get_vars() 32980 1727096605.68726: done getting variables 32980 1727096605.68796: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:03:25 -0400 (0:00:00.387) 0:00:17.614 ****** 32980 1727096605.68822: entering _queue_task() for managed_node2/set_fact 32980 1727096605.69085: worker is 1 (out of 1 available) 32980 1727096605.69098: exiting _queue_task() for managed_node2/set_fact 32980 1727096605.69109: done queuing things up, now waiting for results queue to drain 32980 1727096605.69111: waiting for pending results... 
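The module invocation and JSON result captured above pin down the command the 'Get NM profile info' shell task ran on managed_node2. Reconstructed from that invocation and from the nm_profile_exists.rc == 0 check evaluated in the next task, the task at get_profile_stat.yml:25 plausibly looks like the sketch below; the command string and register name come from the trace, while ignore_errors and the exact quoting are assumptions.

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true   # assumed, so a missing profile (grep exiting 1) would not fail the play

With profile set to lsr101 via the include params, this is exactly the pipeline shown in the result above, and the two matching connection lines on stdout yield rc=0.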
32980 1727096605.69290: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32980 1727096605.69363: in run() - task 0afff68d-5257-457d-ef33-0000000007f4 32980 1727096605.69379: variable 'ansible_search_path' from source: unknown 32980 1727096605.69382: variable 'ansible_search_path' from source: unknown 32980 1727096605.69409: calling self._execute() 32980 1727096605.69482: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.69486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.69496: variable 'omit' from source: magic vars 32980 1727096605.69769: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.69783: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.69866: variable 'nm_profile_exists' from source: set_fact 32980 1727096605.69883: Evaluated conditional (nm_profile_exists.rc == 0): True 32980 1727096605.69886: variable 'omit' from source: magic vars 32980 1727096605.69918: variable 'omit' from source: magic vars 32980 1727096605.69939: variable 'omit' from source: magic vars 32980 1727096605.69972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096605.70005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096605.70018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096605.70032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.70041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.70063: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096605.70066: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.70072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.70145: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096605.70148: Set connection var ansible_timeout to 10 32980 1727096605.70150: Set connection var ansible_shell_type to sh 32980 1727096605.70153: Set connection var ansible_connection to ssh 32980 1727096605.70160: Set connection var ansible_shell_executable to /bin/sh 32980 1727096605.70164: Set connection var ansible_pipelining to False 32980 1727096605.70183: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.70186: variable 'ansible_connection' from source: unknown 32980 1727096605.70189: variable 'ansible_module_compression' from source: unknown 32980 1727096605.70191: variable 'ansible_shell_type' from source: unknown 32980 1727096605.70193: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.70195: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.70197: variable 'ansible_pipelining' from source: unknown 32980 1727096605.70201: variable 'ansible_timeout' from source: unknown 32980 1727096605.70205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.70308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096605.70330: variable 'omit' from source: magic vars 32980 1727096605.70335: starting attempt loop 32980 1727096605.70337: running the handler 32980 1727096605.70346: handler run complete 32980 1727096605.70354: attempt loop complete, returning result 32980 1727096605.70357: _execute() done 32980 1727096605.70359: dumping result to json 32980 1727096605.70362: done dumping result, returning 32980 1727096605.70370: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-457d-ef33-0000000007f4] 32980 1727096605.70375: sending task result for task 0afff68d-5257-457d-ef33-0000000007f4 32980 1727096605.70451: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f4 32980 1727096605.70454: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 32980 1727096605.70516: no more pending results, returning what we have 32980 1727096605.70518: results queue empty 32980 1727096605.70519: checking for any_errors_fatal 32980 1727096605.70530: done checking for any_errors_fatal 32980 1727096605.70530: checking for max_fail_percentage 32980 1727096605.70532: done checking for max_fail_percentage 32980 1727096605.70533: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.70534: done checking to see if all hosts have failed 32980 1727096605.70534: getting the remaining hosts for this loop 32980 1727096605.70536: done getting the remaining hosts for this loop 32980 1727096605.70540: getting the next task for host managed_node2 32980 1727096605.70550: done getting next task for host managed_node2 32980 1727096605.70552: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 32980 1727096605.70556: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.70560: getting variables 32980 1727096605.70562: in VariableManager get_vars() 32980 1727096605.70618: Calling all_inventory to load vars for managed_node2 32980 1727096605.70621: Calling groups_inventory to load vars for managed_node2 32980 1727096605.70623: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.70632: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.70635: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.70637: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.71913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.72893: done with get_vars() 32980 1727096605.72909: done getting variables 32980 1727096605.72952: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096605.73039: variable 'profile' from source: include params 32980 1727096605.73042: variable 'item' from source: include params 32980 1727096605.73089: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 09:03:25 -0400 (0:00:00.042) 0:00:17.657 ****** 32980 1727096605.73116: entering _queue_task() for managed_node2/command 32980 1727096605.73355: worker is 1 (out of 1 available) 32980 1727096605.73375: exiting _queue_task() for managed_node2/command 32980 1727096605.73387: done queuing things up, now waiting for results queue to drain 32980 1727096605.73388: waiting for pending results... 
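The ok: result above (three facts, changed: false) together with the nm_profile_exists.rc == 0 conditional constrains the set_fact task at get_profile_stat.yml:35 fairly tightly; a hedged reconstruction, with the fact names and condition taken directly from the trace:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0

Because set_fact runs entirely on the controller, no SSH round trip appears for this task in the trace.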
32980 1727096605.73554: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr101 32980 1727096605.73637: in run() - task 0afff68d-5257-457d-ef33-0000000007f6 32980 1727096605.73648: variable 'ansible_search_path' from source: unknown 32980 1727096605.73652: variable 'ansible_search_path' from source: unknown 32980 1727096605.73682: calling self._execute() 32980 1727096605.73763: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.73768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.73780: variable 'omit' from source: magic vars 32980 1727096605.74122: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.74187: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.74247: variable 'profile_stat' from source: set_fact 32980 1727096605.74261: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096605.74264: when evaluation is False, skipping this task 32980 1727096605.74269: _execute() done 32980 1727096605.74272: dumping result to json 32980 1727096605.74278: done dumping result, returning 32980 1727096605.74280: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr101 [0afff68d-5257-457d-ef33-0000000007f6] 32980 1727096605.74285: sending task result for task 0afff68d-5257-457d-ef33-0000000007f6 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096605.74449: no more pending results, returning what we have 32980 1727096605.74453: results queue empty 32980 1727096605.74454: checking for any_errors_fatal 32980 1727096605.74461: done checking for any_errors_fatal 32980 1727096605.74462: checking for max_fail_percentage 32980 1727096605.74463: done checking for max_fail_percentage 32980 1727096605.74464: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.74465: done checking to see if all hosts have failed 32980 1727096605.74466: getting the remaining hosts for this loop 32980 1727096605.74469: done getting the remaining hosts for this loop 32980 1727096605.74475: getting the next task for host managed_node2 32980 1727096605.74483: done getting next task for host managed_node2 32980 1727096605.74485: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 32980 1727096605.74490: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.74494: getting variables 32980 1727096605.74496: in VariableManager get_vars() 32980 1727096605.74651: Calling all_inventory to load vars for managed_node2 32980 1727096605.74654: Calling groups_inventory to load vars for managed_node2 32980 1727096605.74656: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.74666: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.74671: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.74678: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.75197: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f6 32980 1727096605.75201: WORKER PROCESS EXITING 32980 1727096605.76003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.77554: done with get_vars() 32980 1727096605.77587: done getting variables 32980 1727096605.77649: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096605.77760: variable 'profile' from source: include params 32980 1727096605.77764: variable 'item' from source: include params 32980 1727096605.77822: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 09:03:25 -0400 (0:00:00.047) 0:00:17.705 ****** 32980 1727096605.77854: entering _queue_task() for managed_node2/set_fact 32980 1727096605.78192: worker is 1 (out of 1 available) 32980 1727096605.78206: exiting _queue_task() for managed_node2/set_fact 32980 1727096605.78218: done queuing things up, now waiting for results queue to drain 32980 1727096605.78220: waiting for pending results... 
32980 1727096605.78686: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101 32980 1727096605.78691: in run() - task 0afff68d-5257-457d-ef33-0000000007f7 32980 1727096605.78694: variable 'ansible_search_path' from source: unknown 32980 1727096605.78696: variable 'ansible_search_path' from source: unknown 32980 1727096605.78699: calling self._execute() 32980 1727096605.78781: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.78792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.78814: variable 'omit' from source: magic vars 32980 1727096605.79176: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.79196: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.79322: variable 'profile_stat' from source: set_fact 32980 1727096605.79340: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096605.79348: when evaluation is False, skipping this task 32980 1727096605.79361: _execute() done 32980 1727096605.79370: dumping result to json 32980 1727096605.79378: done dumping result, returning 32980 1727096605.79387: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [0afff68d-5257-457d-ef33-0000000007f7] 32980 1727096605.79396: sending task result for task 0afff68d-5257-457d-ef33-0000000007f7 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096605.79643: no more pending results, returning what we have 32980 1727096605.79647: results queue empty 32980 1727096605.79649: checking for any_errors_fatal 32980 1727096605.79655: done checking for any_errors_fatal 32980 1727096605.79655: checking for max_fail_percentage 32980 1727096605.79657: done checking for max_fail_percentage 32980 1727096605.79658: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.79659: done checking to see if all hosts have failed 32980 1727096605.79660: getting the remaining hosts for this loop 32980 1727096605.79662: done getting the remaining hosts for this loop 32980 1727096605.79665: getting the next task for host managed_node2 32980 1727096605.79677: done getting next task for host managed_node2 32980 1727096605.79680: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 32980 1727096605.79684: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.79689: getting variables 32980 1727096605.79690: in VariableManager get_vars() 32980 1727096605.79734: Calling all_inventory to load vars for managed_node2 32980 1727096605.79737: Calling groups_inventory to load vars for managed_node2 32980 1727096605.79740: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.79752: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.79756: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.79759: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.80305: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f7 32980 1727096605.80308: WORKER PROCESS EXITING 32980 1727096605.81409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.82932: done with get_vars() 32980 1727096605.82955: done getting variables 32980 1727096605.83017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096605.83126: variable 'profile' from source: include params 32980 1727096605.83130: variable 'item' from source: include params 32980 1727096605.83191: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 09:03:25 -0400 (0:00:00.053) 0:00:17.758 ****** 32980 1727096605.83221: entering _queue_task() for managed_node2/command 32980 1727096605.83537: worker is 1 (out of 1 available) 32980 1727096605.83550: exiting _queue_task() for managed_node2/command 32980 1727096605.83562: done queuing things up, now waiting for results queue to drain 32980 1727096605.83564: waiting for pending results... 
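The ifcfg-related tasks skipped in this stretch come in get/verify pairs: a command task (the 'command' action plugin is loaded for the 'Get ...' tasks) greps the initscripts ifcfg file for a marker comment, and a set_fact task (loaded for the 'Verify ...' tasks) records the outcome, all guarded by profile_stat.stat.exists so they only run when an ifcfg-{{ profile }} file actually exists. The exact commands and fact expressions are not visible in this trace; the sketch below is a hypothetical illustration of the pattern only, with the file path, grep pattern, and register name assumed.

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed     # hypothetical register name
  when: profile_stat.stat.exists

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.set_fact:
    lsr_net_profile_ansible_managed: "{{ ifcfg_ansible_managed.rc == 0 }}"   # hypothetical expression
  when: profile_stat.stat.exists

Since this run found an NM keyfile profile rather than an ifcfg file, the whole pair (and the analogous fingerprint pair) is skipped.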
32980 1727096605.83839: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr101 32980 1727096605.83961: in run() - task 0afff68d-5257-457d-ef33-0000000007f8 32980 1727096605.83986: variable 'ansible_search_path' from source: unknown 32980 1727096605.84102: variable 'ansible_search_path' from source: unknown 32980 1727096605.84107: calling self._execute() 32980 1727096605.84128: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.84138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.84153: variable 'omit' from source: magic vars 32980 1727096605.84501: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.84519: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.84653: variable 'profile_stat' from source: set_fact 32980 1727096605.84674: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096605.84683: when evaluation is False, skipping this task 32980 1727096605.84690: _execute() done 32980 1727096605.84697: dumping result to json 32980 1727096605.84703: done dumping result, returning 32980 1727096605.84713: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr101 [0afff68d-5257-457d-ef33-0000000007f8] 32980 1727096605.84722: sending task result for task 0afff68d-5257-457d-ef33-0000000007f8 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096605.84909: no more pending results, returning what we have 32980 1727096605.84913: results queue empty 32980 1727096605.84914: checking for any_errors_fatal 32980 1727096605.84920: done checking for any_errors_fatal 32980 1727096605.84921: checking for max_fail_percentage 32980 1727096605.84923: done checking for max_fail_percentage 32980 1727096605.84924: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.84924: done checking to see if all hosts have failed 32980 1727096605.84925: getting the remaining hosts for this loop 32980 1727096605.84927: done getting the remaining hosts for this loop 32980 1727096605.84931: getting the next task for host managed_node2 32980 1727096605.84940: done getting next task for host managed_node2 32980 1727096605.84944: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 32980 1727096605.84949: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.84954: getting variables 32980 1727096605.84955: in VariableManager get_vars() 32980 1727096605.84999: Calling all_inventory to load vars for managed_node2 32980 1727096605.85002: Calling groups_inventory to load vars for managed_node2 32980 1727096605.85005: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.85018: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.85021: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.85024: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.85683: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f8 32980 1727096605.85686: WORKER PROCESS EXITING 32980 1727096605.86539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.88065: done with get_vars() 32980 1727096605.88092: done getting variables 32980 1727096605.88149: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096605.88256: variable 'profile' from source: include params 32980 1727096605.88260: variable 'item' from source: include params 32980 1727096605.88320: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 09:03:25 -0400 (0:00:00.051) 0:00:17.810 ****** 32980 1727096605.88352: entering _queue_task() for managed_node2/set_fact 32980 1727096605.88705: worker is 1 (out of 1 available) 32980 1727096605.88718: exiting _queue_task() for managed_node2/set_fact 32980 1727096605.88728: done queuing things up, now waiting for results queue to drain 32980 1727096605.88730: waiting for pending results... 
32980 1727096605.88972: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr101 32980 1727096605.89097: in run() - task 0afff68d-5257-457d-ef33-0000000007f9 32980 1727096605.89120: variable 'ansible_search_path' from source: unknown 32980 1727096605.89128: variable 'ansible_search_path' from source: unknown 32980 1727096605.89164: calling self._execute() 32980 1727096605.89264: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.89280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.89296: variable 'omit' from source: magic vars 32980 1727096605.89645: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.89663: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.89788: variable 'profile_stat' from source: set_fact 32980 1727096605.89811: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096605.89819: when evaluation is False, skipping this task 32980 1727096605.89826: _execute() done 32980 1727096605.89833: dumping result to json 32980 1727096605.89840: done dumping result, returning 32980 1727096605.89850: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr101 [0afff68d-5257-457d-ef33-0000000007f9] 32980 1727096605.89860: sending task result for task 0afff68d-5257-457d-ef33-0000000007f9 32980 1727096605.90073: done sending task result for task 0afff68d-5257-457d-ef33-0000000007f9 32980 1727096605.90077: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096605.90124: no more pending results, returning what we have 32980 1727096605.90128: results queue empty 32980 1727096605.90129: checking for any_errors_fatal 32980 1727096605.90135: done checking for any_errors_fatal 32980 1727096605.90135: checking for max_fail_percentage 32980 1727096605.90137: done checking for max_fail_percentage 32980 1727096605.90138: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.90138: done checking to see if all hosts have failed 32980 1727096605.90139: getting the remaining hosts for this loop 32980 1727096605.90141: done getting the remaining hosts for this loop 32980 1727096605.90144: getting the next task for host managed_node2 32980 1727096605.90154: done getting next task for host managed_node2 32980 1727096605.90156: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 32980 1727096605.90159: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.90163: getting variables 32980 1727096605.90165: in VariableManager get_vars() 32980 1727096605.90208: Calling all_inventory to load vars for managed_node2 32980 1727096605.90212: Calling groups_inventory to load vars for managed_node2 32980 1727096605.90214: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.90227: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.90230: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.90232: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.91805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.93322: done with get_vars() 32980 1727096605.93348: done getting variables 32980 1727096605.93411: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096605.93523: variable 'profile' from source: include params 32980 1727096605.93527: variable 'item' from source: include params 32980 1727096605.93584: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 09:03:25 -0400 (0:00:00.052) 0:00:17.862 ****** 32980 1727096605.93616: entering _queue_task() for managed_node2/assert 32980 1727096605.93934: worker is 1 (out of 1 available) 32980 1727096605.93946: exiting _queue_task() for managed_node2/assert 32980 1727096605.93958: done queuing things up, now waiting for results queue to drain 32980 1727096605.93959: waiting for pending results... 
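The next three tasks come from assert_profile_present.yml (lines 5, 10 and 15 per the task paths in the log), and each one evaluates a single flag registered earlier in the run: lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint. A hedged reconstruction of that trio is sketched below; the task names and flag names are taken from the log, while the exact YAML layout (and any fail_msg text, omitted here) is an assumption.

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists          # evaluated True in this run

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint
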
32980 1727096605.94249: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'lsr101' 32980 1727096605.94478: in run() - task 0afff68d-5257-457d-ef33-0000000006b9 32980 1727096605.94482: variable 'ansible_search_path' from source: unknown 32980 1727096605.94485: variable 'ansible_search_path' from source: unknown 32980 1727096605.94488: calling self._execute() 32980 1727096605.94547: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.94558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.94578: variable 'omit' from source: magic vars 32980 1727096605.94966: variable 'ansible_distribution_major_version' from source: facts 32980 1727096605.94991: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096605.95047: variable 'omit' from source: magic vars 32980 1727096605.95052: variable 'omit' from source: magic vars 32980 1727096605.95159: variable 'profile' from source: include params 32980 1727096605.95176: variable 'item' from source: include params 32980 1727096605.95240: variable 'item' from source: include params 32980 1727096605.95276: variable 'omit' from source: magic vars 32980 1727096605.95378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096605.95381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096605.95395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096605.95417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.95433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096605.95471: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096605.95492: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.95502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.95621: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096605.95703: Set connection var ansible_timeout to 10 32980 1727096605.95706: Set connection var ansible_shell_type to sh 32980 1727096605.95708: Set connection var ansible_connection to ssh 32980 1727096605.95710: Set connection var ansible_shell_executable to /bin/sh 32980 1727096605.95713: Set connection var ansible_pipelining to False 32980 1727096605.95714: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.95716: variable 'ansible_connection' from source: unknown 32980 1727096605.95719: variable 'ansible_module_compression' from source: unknown 32980 1727096605.95721: variable 'ansible_shell_type' from source: unknown 32980 1727096605.95722: variable 'ansible_shell_executable' from source: unknown 32980 1727096605.95724: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096605.95726: variable 'ansible_pipelining' from source: unknown 32980 1727096605.95728: variable 'ansible_timeout' from source: unknown 32980 1727096605.95732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096605.95880: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096605.95896: variable 'omit' from source: magic vars 32980 1727096605.95906: starting attempt loop 32980 1727096605.95923: running the handler 32980 1727096605.96045: variable 'lsr_net_profile_exists' from source: set_fact 32980 1727096605.96056: Evaluated conditional (lsr_net_profile_exists): True 32980 1727096605.96140: handler run complete 32980 1727096605.96143: attempt loop complete, returning result 32980 1727096605.96145: _execute() done 32980 1727096605.96147: dumping result to json 32980 1727096605.96149: done dumping result, returning 32980 1727096605.96151: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'lsr101' [0afff68d-5257-457d-ef33-0000000006b9] 32980 1727096605.96152: sending task result for task 0afff68d-5257-457d-ef33-0000000006b9 32980 1727096605.96221: done sending task result for task 0afff68d-5257-457d-ef33-0000000006b9 32980 1727096605.96224: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096605.96296: no more pending results, returning what we have 32980 1727096605.96299: results queue empty 32980 1727096605.96300: checking for any_errors_fatal 32980 1727096605.96308: done checking for any_errors_fatal 32980 1727096605.96308: checking for max_fail_percentage 32980 1727096605.96310: done checking for max_fail_percentage 32980 1727096605.96311: checking to see if all hosts have failed and the running result is not ok 32980 1727096605.96312: done checking to see if all hosts have failed 32980 1727096605.96313: getting the remaining hosts for this loop 32980 1727096605.96314: done getting the remaining hosts for this loop 32980 1727096605.96318: getting the next task for host managed_node2 32980 1727096605.96326: done getting next task for host managed_node2 32980 1727096605.96328: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 32980 1727096605.96332: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096605.96336: getting variables 32980 1727096605.96337: in VariableManager get_vars() 32980 1727096605.96387: Calling all_inventory to load vars for managed_node2 32980 1727096605.96390: Calling groups_inventory to load vars for managed_node2 32980 1727096605.96393: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096605.96405: Calling all_plugins_play to load vars for managed_node2 32980 1727096605.96408: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096605.96410: Calling groups_plugins_play to load vars for managed_node2 32980 1727096605.98147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096605.99920: done with get_vars() 32980 1727096605.99958: done getting variables 32980 1727096606.00024: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096606.00150: variable 'profile' from source: include params 32980 1727096606.00154: variable 'item' from source: include params 32980 1727096606.00223: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101'] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 09:03:26 -0400 (0:00:00.066) 0:00:17.929 ****** 32980 1727096606.00260: entering _queue_task() for managed_node2/assert 32980 1727096606.01060: worker is 1 (out of 1 available) 32980 1727096606.01075: exiting _queue_task() for managed_node2/assert 32980 1727096606.01178: done queuing things up, now waiting for results queue to drain 32980 1727096606.01180: waiting for pending results... 
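Each assert run above also prints a block of "Set connection var ..." lines (module compression ZIP_DEFLATED, timeout 10, shell type sh, connection ssh, shell executable /bin/sh, pipelining False). Most of those values are ansible-core defaults; the log only confirms that ansible_host and ansible_ssh_extra_args come from host vars for managed_node2. A sketch of inventory host vars that would surface this way is shown below; every value that is not visible in the log is a placeholder, not the real test inventory.

    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.15.126          # the address the SSH debug output connects to later in the log
          ansible_connection: ssh
          ansible_shell_executable: /bin/sh
          ansible_pipelining: false           # matches "Set connection var ansible_pipelining to False"
          ansible_timeout: 10                 # matches "Set connection var ansible_timeout to 10"
          ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder; the real value is never printed
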
32980 1727096606.01586: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'lsr101' 32980 1727096606.01624: in run() - task 0afff68d-5257-457d-ef33-0000000006ba 32980 1727096606.01637: variable 'ansible_search_path' from source: unknown 32980 1727096606.01640: variable 'ansible_search_path' from source: unknown 32980 1727096606.01670: calling self._execute() 32980 1727096606.01963: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.02273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.02278: variable 'omit' from source: magic vars 32980 1727096606.02743: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.02763: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.02840: variable 'omit' from source: magic vars 32980 1727096606.02889: variable 'omit' from source: magic vars 32980 1727096606.03011: variable 'profile' from source: include params 32980 1727096606.03022: variable 'item' from source: include params 32980 1727096606.03102: variable 'item' from source: include params 32980 1727096606.03127: variable 'omit' from source: magic vars 32980 1727096606.03185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096606.03221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096606.03246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096606.03281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.03302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.03336: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096606.03343: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.03350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.03470: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096606.03492: Set connection var ansible_timeout to 10 32980 1727096606.03499: Set connection var ansible_shell_type to sh 32980 1727096606.03510: Set connection var ansible_connection to ssh 32980 1727096606.03522: Set connection var ansible_shell_executable to /bin/sh 32980 1727096606.03531: Set connection var ansible_pipelining to False 32980 1727096606.03554: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.03561: variable 'ansible_connection' from source: unknown 32980 1727096606.03567: variable 'ansible_module_compression' from source: unknown 32980 1727096606.03578: variable 'ansible_shell_type' from source: unknown 32980 1727096606.03592: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.03615: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.03618: variable 'ansible_pipelining' from source: unknown 32980 1727096606.03620: variable 'ansible_timeout' from source: unknown 32980 1727096606.03622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.03811: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096606.03815: variable 'omit' from source: magic vars 32980 1727096606.03817: starting attempt loop 32980 1727096606.03820: running the handler 32980 1727096606.03933: variable 'lsr_net_profile_ansible_managed' from source: set_fact 32980 1727096606.03950: Evaluated conditional (lsr_net_profile_ansible_managed): True 32980 1727096606.03959: handler run complete 32980 1727096606.03984: attempt loop complete, returning result 32980 1727096606.04029: _execute() done 32980 1727096606.04032: dumping result to json 32980 1727096606.04035: done dumping result, returning 32980 1727096606.04037: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'lsr101' [0afff68d-5257-457d-ef33-0000000006ba] 32980 1727096606.04039: sending task result for task 0afff68d-5257-457d-ef33-0000000006ba ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096606.04296: no more pending results, returning what we have 32980 1727096606.04300: results queue empty 32980 1727096606.04301: checking for any_errors_fatal 32980 1727096606.04306: done checking for any_errors_fatal 32980 1727096606.04307: checking for max_fail_percentage 32980 1727096606.04308: done checking for max_fail_percentage 32980 1727096606.04309: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.04310: done checking to see if all hosts have failed 32980 1727096606.04311: getting the remaining hosts for this loop 32980 1727096606.04313: done getting the remaining hosts for this loop 32980 1727096606.04316: getting the next task for host managed_node2 32980 1727096606.04325: done getting next task for host managed_node2 32980 1727096606.04327: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 32980 1727096606.04331: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096606.04334: getting variables 32980 1727096606.04336: in VariableManager get_vars() 32980 1727096606.04390: Calling all_inventory to load vars for managed_node2 32980 1727096606.04393: Calling groups_inventory to load vars for managed_node2 32980 1727096606.04396: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.04470: done sending task result for task 0afff68d-5257-457d-ef33-0000000006ba 32980 1727096606.04476: WORKER PROCESS EXITING 32980 1727096606.04487: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.04490: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.04493: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.06750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.08362: done with get_vars() 32980 1727096606.08388: done getting variables 32980 1727096606.08443: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096606.08555: variable 'profile' from source: include params 32980 1727096606.08559: variable 'item' from source: include params 32980 1727096606.08624: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 09:03:26 -0400 (0:00:00.084) 0:00:18.013 ****** 32980 1727096606.08662: entering _queue_task() for managed_node2/assert 32980 1727096606.08966: worker is 1 (out of 1 available) 32980 1727096606.08982: exiting _queue_task() for managed_node2/assert 32980 1727096606.08993: done queuing things up, now waiting for results queue to drain 32980 1727096606.08994: waiting for pending results... 
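Throughout this block, profile and item are reported "from source: include params", and the templated task names render {{ profile }} as lsr101. That is the usual shape of an include_tasks called in a loop that passes the loop item down as a variable. A hedged sketch of such a caller follows; the caller's task name and the loop list are assumptions, and only the file name assert_profile_present.yml and the value lsr101 appear in the log.

    - name: Assert profile present for each tested profile   # assumed caller name
      include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"    # why the log reports profile "from source: include params"
      loop:
        - lsr101                 # the value the templated task names resolve to in this run
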
32980 1727096606.09688: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in lsr101 32980 1727096606.09724: in run() - task 0afff68d-5257-457d-ef33-0000000006bb 32980 1727096606.09742: variable 'ansible_search_path' from source: unknown 32980 1727096606.09779: variable 'ansible_search_path' from source: unknown 32980 1727096606.10026: calling self._execute() 32980 1727096606.10080: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.10091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.10107: variable 'omit' from source: magic vars 32980 1727096606.11031: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.11174: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.11191: variable 'omit' from source: magic vars 32980 1727096606.11242: variable 'omit' from source: magic vars 32980 1727096606.11519: variable 'profile' from source: include params 32980 1727096606.11531: variable 'item' from source: include params 32980 1727096606.11650: variable 'item' from source: include params 32980 1727096606.11874: variable 'omit' from source: magic vars 32980 1727096606.11877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096606.11880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096606.11900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096606.11997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.12016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.12053: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096606.12081: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.12272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.12323: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096606.12382: Set connection var ansible_timeout to 10 32980 1727096606.12390: Set connection var ansible_shell_type to sh 32980 1727096606.12397: Set connection var ansible_connection to ssh 32980 1727096606.12411: Set connection var ansible_shell_executable to /bin/sh 32980 1727096606.12430: Set connection var ansible_pipelining to False 32980 1727096606.12456: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.12653: variable 'ansible_connection' from source: unknown 32980 1727096606.12656: variable 'ansible_module_compression' from source: unknown 32980 1727096606.12658: variable 'ansible_shell_type' from source: unknown 32980 1727096606.12660: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.12662: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.12664: variable 'ansible_pipelining' from source: unknown 32980 1727096606.12666: variable 'ansible_timeout' from source: unknown 32980 1727096606.12744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.12966: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096606.12972: variable 'omit' from source: magic vars 32980 1727096606.12974: starting attempt loop 32980 1727096606.12977: running the handler 32980 1727096606.13224: variable 'lsr_net_profile_fingerprint' from source: set_fact 32980 1727096606.13235: Evaluated conditional (lsr_net_profile_fingerprint): True 32980 1727096606.13248: handler run complete 32980 1727096606.13272: attempt loop complete, returning result 32980 1727096606.13297: _execute() done 32980 1727096606.13416: dumping result to json 32980 1727096606.13419: done dumping result, returning 32980 1727096606.13421: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in lsr101 [0afff68d-5257-457d-ef33-0000000006bb] 32980 1727096606.13423: sending task result for task 0afff68d-5257-457d-ef33-0000000006bb ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096606.13571: no more pending results, returning what we have 32980 1727096606.13575: results queue empty 32980 1727096606.13576: checking for any_errors_fatal 32980 1727096606.13584: done checking for any_errors_fatal 32980 1727096606.13585: checking for max_fail_percentage 32980 1727096606.13587: done checking for max_fail_percentage 32980 1727096606.13588: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.13589: done checking to see if all hosts have failed 32980 1727096606.13590: getting the remaining hosts for this loop 32980 1727096606.13591: done getting the remaining hosts for this loop 32980 1727096606.13596: getting the next task for host managed_node2 32980 1727096606.13609: done getting next task for host managed_node2 32980 1727096606.13614: ^ task is: TASK: Include the task 'get_profile_stat.yml' 32980 1727096606.13617: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096606.13622: getting variables 32980 1727096606.13624: in VariableManager get_vars() 32980 1727096606.13831: Calling all_inventory to load vars for managed_node2 32980 1727096606.13835: Calling groups_inventory to load vars for managed_node2 32980 1727096606.13838: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.13851: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.13855: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.13858: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.14384: done sending task result for task 0afff68d-5257-457d-ef33-0000000006bb 32980 1727096606.14388: WORKER PROCESS EXITING 32980 1727096606.17080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.22810: done with get_vars() 32980 1727096606.22831: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 09:03:26 -0400 (0:00:00.142) 0:00:18.155 ****** 32980 1727096606.22919: entering _queue_task() for managed_node2/include_tasks 32980 1727096606.23276: worker is 1 (out of 1 available) 32980 1727096606.23291: exiting _queue_task() for managed_node2/include_tasks 32980 1727096606.23305: done queuing things up, now waiting for results queue to drain 32980 1727096606.23307: waiting for pending results... 32980 1727096606.23599: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 32980 1727096606.23732: in run() - task 0afff68d-5257-457d-ef33-0000000006bf 32980 1727096606.23750: variable 'ansible_search_path' from source: unknown 32980 1727096606.23757: variable 'ansible_search_path' from source: unknown 32980 1727096606.23808: calling self._execute() 32980 1727096606.23930: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.23933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.23943: variable 'omit' from source: magic vars 32980 1727096606.24235: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.24245: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.24252: _execute() done 32980 1727096606.24256: dumping result to json 32980 1727096606.24259: done dumping result, returning 32980 1727096606.24265: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-457d-ef33-0000000006bf] 32980 1727096606.24272: sending task result for task 0afff68d-5257-457d-ef33-0000000006bf 32980 1727096606.24357: done sending task result for task 0afff68d-5257-457d-ef33-0000000006bf 32980 1727096606.24360: WORKER PROCESS EXITING 32980 1727096606.24394: no more pending results, returning what we have 32980 1727096606.24400: in VariableManager get_vars() 32980 1727096606.24444: Calling all_inventory to load vars for managed_node2 32980 1727096606.24446: Calling groups_inventory to load vars for managed_node2 32980 1727096606.24448: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.24460: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.24462: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.24465: Calling groups_plugins_play 
to load vars for managed_node2 32980 1727096606.25238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.26646: done with get_vars() 32980 1727096606.26671: variable 'ansible_search_path' from source: unknown 32980 1727096606.26673: variable 'ansible_search_path' from source: unknown 32980 1727096606.26700: we have included files to process 32980 1727096606.26701: generating all_blocks data 32980 1727096606.26702: done generating all_blocks data 32980 1727096606.26706: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096606.26706: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096606.26708: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32980 1727096606.27594: done processing included file 32980 1727096606.27596: iterating over new_blocks loaded from include file 32980 1727096606.27597: in VariableManager get_vars() 32980 1727096606.27616: done with get_vars() 32980 1727096606.27618: filtering new block on tags 32980 1727096606.27641: done filtering new block on tags 32980 1727096606.27644: in VariableManager get_vars() 32980 1727096606.27662: done with get_vars() 32980 1727096606.27663: filtering new block on tags 32980 1727096606.27688: done filtering new block on tags 32980 1727096606.27691: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 32980 1727096606.27696: extending task lists for all hosts with included blocks 32980 1727096606.27862: done extending task lists 32980 1727096606.27863: done processing included files 32980 1727096606.27864: results queue empty 32980 1727096606.27865: checking for any_errors_fatal 32980 1727096606.27869: done checking for any_errors_fatal 32980 1727096606.27870: checking for max_fail_percentage 32980 1727096606.27871: done checking for max_fail_percentage 32980 1727096606.27872: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.27875: done checking to see if all hosts have failed 32980 1727096606.27876: getting the remaining hosts for this loop 32980 1727096606.27877: done getting the remaining hosts for this loop 32980 1727096606.27880: getting the next task for host managed_node2 32980 1727096606.27884: done getting next task for host managed_node2 32980 1727096606.27886: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 32980 1727096606.27889: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096606.27892: getting variables 32980 1727096606.27893: in VariableManager get_vars() 32980 1727096606.27905: Calling all_inventory to load vars for managed_node2 32980 1727096606.27908: Calling groups_inventory to load vars for managed_node2 32980 1727096606.27910: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.27915: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.27917: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.27923: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.28636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.29478: done with get_vars() 32980 1727096606.29492: done getting variables 32980 1727096606.29519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:03:26 -0400 (0:00:00.066) 0:00:18.222 ****** 32980 1727096606.29539: entering _queue_task() for managed_node2/set_fact 32980 1727096606.29813: worker is 1 (out of 1 available) 32980 1727096606.29825: exiting _queue_task() for managed_node2/set_fact 32980 1727096606.29838: done queuing things up, now waiting for results queue to drain 32980 1727096606.29839: waiting for pending results... 
32980 1727096606.30188: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 32980 1727096606.30212: in run() - task 0afff68d-5257-457d-ef33-000000000838 32980 1727096606.30232: variable 'ansible_search_path' from source: unknown 32980 1727096606.30239: variable 'ansible_search_path' from source: unknown 32980 1727096606.30286: calling self._execute() 32980 1727096606.30381: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.30396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.30413: variable 'omit' from source: magic vars 32980 1727096606.30813: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.30844: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.30848: variable 'omit' from source: magic vars 32980 1727096606.30881: variable 'omit' from source: magic vars 32980 1727096606.30906: variable 'omit' from source: magic vars 32980 1727096606.30943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096606.30972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096606.30991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096606.31005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.31015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.31038: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096606.31041: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.31044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.31122: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096606.31126: Set connection var ansible_timeout to 10 32980 1727096606.31128: Set connection var ansible_shell_type to sh 32980 1727096606.31131: Set connection var ansible_connection to ssh 32980 1727096606.31138: Set connection var ansible_shell_executable to /bin/sh 32980 1727096606.31143: Set connection var ansible_pipelining to False 32980 1727096606.31160: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.31165: variable 'ansible_connection' from source: unknown 32980 1727096606.31170: variable 'ansible_module_compression' from source: unknown 32980 1727096606.31173: variable 'ansible_shell_type' from source: unknown 32980 1727096606.31175: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.31177: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.31180: variable 'ansible_pipelining' from source: unknown 32980 1727096606.31182: variable 'ansible_timeout' from source: unknown 32980 1727096606.31190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.31287: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096606.31297: variable 
'omit' from source: magic vars 32980 1727096606.31300: starting attempt loop 32980 1727096606.31303: running the handler 32980 1727096606.31315: handler run complete 32980 1727096606.31324: attempt loop complete, returning result 32980 1727096606.31326: _execute() done 32980 1727096606.31329: dumping result to json 32980 1727096606.31331: done dumping result, returning 32980 1727096606.31336: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-457d-ef33-000000000838] 32980 1727096606.31341: sending task result for task 0afff68d-5257-457d-ef33-000000000838 32980 1727096606.31422: done sending task result for task 0afff68d-5257-457d-ef33-000000000838 32980 1727096606.31425: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 32980 1727096606.31476: no more pending results, returning what we have 32980 1727096606.31479: results queue empty 32980 1727096606.31479: checking for any_errors_fatal 32980 1727096606.31481: done checking for any_errors_fatal 32980 1727096606.31482: checking for max_fail_percentage 32980 1727096606.31483: done checking for max_fail_percentage 32980 1727096606.31484: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.31485: done checking to see if all hosts have failed 32980 1727096606.31485: getting the remaining hosts for this loop 32980 1727096606.31487: done getting the remaining hosts for this loop 32980 1727096606.31490: getting the next task for host managed_node2 32980 1727096606.31497: done getting next task for host managed_node2 32980 1727096606.31500: ^ task is: TASK: Stat profile file 32980 1727096606.31503: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096606.31507: getting variables 32980 1727096606.31509: in VariableManager get_vars() 32980 1727096606.31547: Calling all_inventory to load vars for managed_node2 32980 1727096606.31549: Calling groups_inventory to load vars for managed_node2 32980 1727096606.31551: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.31561: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.31563: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.31565: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.32316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.33176: done with get_vars() 32980 1727096606.33193: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:03:26 -0400 (0:00:00.037) 0:00:18.259 ****** 32980 1727096606.33252: entering _queue_task() for managed_node2/stat 32980 1727096606.33462: worker is 1 (out of 1 available) 32980 1727096606.33475: exiting _queue_task() for managed_node2/stat 32980 1727096606.33487: done queuing things up, now waiting for results queue to drain 32980 1727096606.33488: waiting for pending results... 32980 1727096606.33656: running TaskExecutor() for managed_node2/TASK: Stat profile file 32980 1727096606.33742: in run() - task 0afff68d-5257-457d-ef33-000000000839 32980 1727096606.33752: variable 'ansible_search_path' from source: unknown 32980 1727096606.33756: variable 'ansible_search_path' from source: unknown 32980 1727096606.33785: calling self._execute() 32980 1727096606.33851: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.33855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.33863: variable 'omit' from source: magic vars 32980 1727096606.34136: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.34148: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.34151: variable 'omit' from source: magic vars 32980 1727096606.34189: variable 'omit' from source: magic vars 32980 1727096606.34254: variable 'profile' from source: include params 32980 1727096606.34258: variable 'item' from source: include params 32980 1727096606.34308: variable 'item' from source: include params 32980 1727096606.34321: variable 'omit' from source: magic vars 32980 1727096606.34353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096606.34385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096606.34400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096606.34413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.34423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.34445: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096606.34448: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.34451: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.34525: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096606.34529: Set connection var ansible_timeout to 10 32980 1727096606.34531: Set connection var ansible_shell_type to sh 32980 1727096606.34534: Set connection var ansible_connection to ssh 32980 1727096606.34540: Set connection var ansible_shell_executable to /bin/sh 32980 1727096606.34545: Set connection var ansible_pipelining to False 32980 1727096606.34560: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.34563: variable 'ansible_connection' from source: unknown 32980 1727096606.34566: variable 'ansible_module_compression' from source: unknown 32980 1727096606.34570: variable 'ansible_shell_type' from source: unknown 32980 1727096606.34573: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.34578: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.34581: variable 'ansible_pipelining' from source: unknown 32980 1727096606.34585: variable 'ansible_timeout' from source: unknown 32980 1727096606.34588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.34729: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096606.34738: variable 'omit' from source: magic vars 32980 1727096606.34744: starting attempt loop 32980 1727096606.34747: running the handler 32980 1727096606.34757: _low_level_execute_command(): starting 32980 1727096606.34763: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096606.35243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.35282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.35286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096606.35289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096606.35291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.35325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.35334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.35345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.35399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.37051: stdout chunk (state=3): >>>/root <<< 32980 1727096606.37153: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.37184: stderr chunk (state=3): >>><<< 32980 1727096606.37187: stdout chunk (state=3): >>><<< 32980 1727096606.37210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.37221: _low_level_execute_command(): starting 32980 1727096606.37228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203 `" && echo ansible-tmp-1727096606.3720822-33864-272644250923203="` echo /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203 `" ) && sleep 0' 32980 1727096606.37667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.37672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096606.37681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096606.37684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096606.37686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.37729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.37732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.37771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.39652: stdout chunk (state=3): 
>>>ansible-tmp-1727096606.3720822-33864-272644250923203=/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203 <<< 32980 1727096606.39768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.39794: stderr chunk (state=3): >>><<< 32980 1727096606.39797: stdout chunk (state=3): >>><<< 32980 1727096606.39812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096606.3720822-33864-272644250923203=/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.39851: variable 'ansible_module_compression' from source: unknown 32980 1727096606.39895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32980 1727096606.39925: variable 'ansible_facts' from source: unknown 32980 1727096606.39991: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py 32980 1727096606.40091: Sending initial data 32980 1727096606.40094: Sent initial data (153 bytes) 32980 1727096606.40532: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.40535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096606.40537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096606.40539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.40542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.40592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.40598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.40601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.40633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.42188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 32980 1727096606.42199: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096606.42217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096606.42253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp4_c_45ba /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py <<< 32980 1727096606.42261: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py" <<< 32980 1727096606.42281: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp4_c_45ba" to remote "/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py" <<< 32980 1727096606.42290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py" <<< 32980 1727096606.42775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.42811: stderr chunk (state=3): >>><<< 32980 1727096606.42815: stdout chunk (state=3): >>><<< 32980 1727096606.42851: done transferring module to remote 32980 1727096606.42859: _low_level_execute_command(): starting 32980 1727096606.42866: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/ /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py && sleep 0' 32980 1727096606.43305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.43308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096606.43311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.43313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 32980 1727096606.43319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096606.43321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.43359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.43363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.43400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.45134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.45158: stderr chunk (state=3): >>><<< 32980 1727096606.45161: stdout chunk (state=3): >>><<< 32980 1727096606.45178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.45181: _low_level_execute_command(): starting 32980 1727096606.45184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/AnsiballZ_stat.py && sleep 0' 32980 1727096606.45600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.45603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.45605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.45607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096606.45609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.45650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.45653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.45700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.60817: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32980 1727096606.62102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096606.62126: stderr chunk (state=3): >>><<< 32980 1727096606.62129: stdout chunk (state=3): >>><<< 32980 1727096606.62148: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096606.62175: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096606.62183: _low_level_execute_command(): starting 32980 1727096606.62190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096606.3720822-33864-272644250923203/ > /dev/null 2>&1 && sleep 0' 32980 1727096606.62627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.62630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096606.62658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.62661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.62670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.62725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.62728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.62734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.62768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.64684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.64706: stderr chunk (state=3): >>><<< 32980 1727096606.64709: stdout chunk (state=3): >>><<< 32980 1727096606.64726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.64747: handler run complete 32980 1727096606.64757: attempt loop complete, returning result 32980 1727096606.64760: _execute() done 32980 1727096606.64763: dumping result to json 32980 1727096606.64765: done dumping result, returning 32980 1727096606.64781: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-457d-ef33-000000000839] 32980 1727096606.64807: sending task result for task 0afff68d-5257-457d-ef33-000000000839 32980 1727096606.64905: done sending task result for task 0afff68d-5257-457d-ef33-000000000839 32980 1727096606.64908: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 32980 1727096606.64966: no more pending results, returning what we have 32980 1727096606.64971: results queue empty 32980 1727096606.64972: checking for any_errors_fatal 32980 1727096606.64982: done checking for any_errors_fatal 32980 1727096606.64983: checking for max_fail_percentage 32980 1727096606.64984: done checking for max_fail_percentage 32980 1727096606.64985: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.64986: done checking to see if all hosts have failed 32980 1727096606.64986: getting the remaining hosts for this loop 32980 1727096606.64988: done getting the remaining hosts for this loop 32980 1727096606.64991: getting the next task for host managed_node2 32980 1727096606.65000: done getting next task for host managed_node2 32980 1727096606.65002: ^ task is: TASK: Set NM profile exist flag based on the profile files 32980 1727096606.65005: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096606.65009: getting variables 32980 1727096606.65010: in VariableManager get_vars() 32980 1727096606.65054: Calling all_inventory to load vars for managed_node2 32980 1727096606.65057: Calling groups_inventory to load vars for managed_node2 32980 1727096606.65059: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.65079: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.65083: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.65090: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.65998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.67035: done with get_vars() 32980 1727096606.67059: done getting variables 32980 1727096606.67119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 09:03:26 -0400 (0:00:00.338) 0:00:18.598 ****** 32980 1727096606.67148: entering _queue_task() for managed_node2/set_fact 32980 1727096606.67444: worker is 1 (out of 1 available) 32980 1727096606.67457: exiting _queue_task() for managed_node2/set_fact 32980 1727096606.67671: done queuing things up, now waiting for results queue to drain 32980 1727096606.67673: waiting for pending results... 
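Note (editor): the exchange traced above is the full remote-execution lifecycle for the "Stat profile file" task: locate the remote home directory, create a per-task temp directory, sftp the AnsiballZ_stat.py payload, chmod it, run it with /usr/bin/python3.12, read the JSON result, then remove the temp directory. A minimal sketch of a tasks-file entry that would produce the invocation shown in the result (path, get_attributes/get_checksum/get_mime all false, result registered as profile_stat) is given below. This is reconstructed from the debug output only; the actual wording in get_profile_stat.yml may differ, and the literal path stands in for the templated profile name (lsr101.90) supplied by the include.

# Reconstructed sketch, not quoted from get_profile_stat.yml:
# stat the legacy initscripts profile file and keep the result for later flag decisions.
- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-lsr101.90   # literal value of the profile item in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat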
32980 1727096606.67759: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 32980 1727096606.67860: in run() - task 0afff68d-5257-457d-ef33-00000000083a 32980 1727096606.67872: variable 'ansible_search_path' from source: unknown 32980 1727096606.67879: variable 'ansible_search_path' from source: unknown 32980 1727096606.67913: calling self._execute() 32980 1727096606.68008: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.68015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.68031: variable 'omit' from source: magic vars 32980 1727096606.68411: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.68424: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.68550: variable 'profile_stat' from source: set_fact 32980 1727096606.68564: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096606.68569: when evaluation is False, skipping this task 32980 1727096606.68572: _execute() done 32980 1727096606.68577: dumping result to json 32980 1727096606.68580: done dumping result, returning 32980 1727096606.68583: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-457d-ef33-00000000083a] 32980 1727096606.68586: sending task result for task 0afff68d-5257-457d-ef33-00000000083a 32980 1727096606.68686: done sending task result for task 0afff68d-5257-457d-ef33-00000000083a 32980 1727096606.68690: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096606.68737: no more pending results, returning what we have 32980 1727096606.68741: results queue empty 32980 1727096606.68742: checking for any_errors_fatal 32980 1727096606.68749: done checking for any_errors_fatal 32980 1727096606.68750: checking for max_fail_percentage 32980 1727096606.68751: done checking for max_fail_percentage 32980 1727096606.68752: checking to see if all hosts have failed and the running result is not ok 32980 1727096606.68753: done checking to see if all hosts have failed 32980 1727096606.68754: getting the remaining hosts for this loop 32980 1727096606.68755: done getting the remaining hosts for this loop 32980 1727096606.68759: getting the next task for host managed_node2 32980 1727096606.68769: done getting next task for host managed_node2 32980 1727096606.68771: ^ task is: TASK: Get NM profile info 32980 1727096606.68776: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096606.68781: getting variables 32980 1727096606.68783: in VariableManager get_vars() 32980 1727096606.68822: Calling all_inventory to load vars for managed_node2 32980 1727096606.68825: Calling groups_inventory to load vars for managed_node2 32980 1727096606.68828: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096606.68841: Calling all_plugins_play to load vars for managed_node2 32980 1727096606.68844: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096606.68847: Calling groups_plugins_play to load vars for managed_node2 32980 1727096606.70265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096606.71947: done with get_vars() 32980 1727096606.71966: done getting variables 32980 1727096606.72022: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 09:03:26 -0400 (0:00:00.049) 0:00:18.647 ****** 32980 1727096606.72050: entering _queue_task() for managed_node2/shell 32980 1727096606.72315: worker is 1 (out of 1 available) 32980 1727096606.72328: exiting _queue_task() for managed_node2/shell 32980 1727096606.72338: done queuing things up, now waiting for results queue to drain 32980 1727096606.72340: waiting for pending results... 32980 1727096606.72690: running TaskExecutor() for managed_node2/TASK: Get NM profile info 32980 1727096606.72739: in run() - task 0afff68d-5257-457d-ef33-00000000083b 32980 1727096606.72743: variable 'ansible_search_path' from source: unknown 32980 1727096606.72746: variable 'ansible_search_path' from source: unknown 32980 1727096606.72785: calling self._execute() 32980 1727096606.72876: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.72880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.72883: variable 'omit' from source: magic vars 32980 1727096606.73237: variable 'ansible_distribution_major_version' from source: facts 32980 1727096606.73283: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096606.73286: variable 'omit' from source: magic vars 32980 1727096606.73307: variable 'omit' from source: magic vars 32980 1727096606.73416: variable 'profile' from source: include params 32980 1727096606.73420: variable 'item' from source: include params 32980 1727096606.73548: variable 'item' from source: include params 32980 1727096606.73551: variable 'omit' from source: magic vars 32980 1727096606.73556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096606.73601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096606.73621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096606.73639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.73655: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096606.73719: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096606.73722: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.73725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.73805: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096606.73811: Set connection var ansible_timeout to 10 32980 1727096606.73813: Set connection var ansible_shell_type to sh 32980 1727096606.73816: Set connection var ansible_connection to ssh 32980 1727096606.73826: Set connection var ansible_shell_executable to /bin/sh 32980 1727096606.73829: Set connection var ansible_pipelining to False 32980 1727096606.73876: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.73879: variable 'ansible_connection' from source: unknown 32980 1727096606.73882: variable 'ansible_module_compression' from source: unknown 32980 1727096606.73884: variable 'ansible_shell_type' from source: unknown 32980 1727096606.73886: variable 'ansible_shell_executable' from source: unknown 32980 1727096606.73888: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096606.73890: variable 'ansible_pipelining' from source: unknown 32980 1727096606.73892: variable 'ansible_timeout' from source: unknown 32980 1727096606.73894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096606.74046: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096606.74050: variable 'omit' from source: magic vars 32980 1727096606.74053: starting attempt loop 32980 1727096606.74055: running the handler 32980 1727096606.74058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096606.74091: _low_level_execute_command(): starting 32980 1727096606.74094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096606.74896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096606.74905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.74908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.75077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.75081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.75084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.75087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.75187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.77012: stdout chunk (state=3): >>>/root <<< 32980 1727096606.77036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.77043: stdout chunk (state=3): >>><<< 32980 1727096606.77108: stderr chunk (state=3): >>><<< 32980 1727096606.77113: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.77116: _low_level_execute_command(): starting 32980 1727096606.77119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914 `" && echo ansible-tmp-1727096606.770762-33874-187221516937914="` echo /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914 `" ) && sleep 0' 32980 1727096606.78378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096606.78381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.78384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.78698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.78722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.78832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.80781: stdout chunk (state=3): >>>ansible-tmp-1727096606.770762-33874-187221516937914=/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914 <<< 32980 1727096606.81076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.81079: stdout chunk (state=3): >>><<< 32980 1727096606.81081: stderr chunk (state=3): >>><<< 32980 1727096606.81084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096606.770762-33874-187221516937914=/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.81086: variable 'ansible_module_compression' from source: unknown 32980 1727096606.81088: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096606.81090: variable 'ansible_facts' from source: unknown 32980 1727096606.81158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py 32980 1727096606.81378: Sending initial data 32980 1727096606.81381: Sent initial data (155 bytes) 32980 1727096606.81871: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096606.81880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.81891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.81987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.82000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.82011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.82029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.82086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.83729: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096606.83764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096606.83902: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp61ub3g1t /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py <<< 32980 1727096606.83906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp61ub3g1t" to remote "/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py" <<< 32980 1727096606.85276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.85280: stderr chunk (state=3): >>><<< 32980 1727096606.85282: stdout chunk (state=3): >>><<< 32980 1727096606.85284: done transferring module to remote 32980 1727096606.85286: _low_level_execute_command(): starting 32980 1727096606.85289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/ /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py && sleep 0' 32980 1727096606.86419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.86422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096606.86424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.86431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.86534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.86547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096606.87090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.87138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096606.88925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096606.88993: stderr chunk (state=3): >>><<< 32980 1727096606.88996: stdout chunk (state=3): >>><<< 32980 1727096606.89010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096606.89017: _low_level_execute_command(): starting 32980 1727096606.89034: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/AnsiballZ_command.py && sleep 0' 32980 1727096606.90425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096606.90434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096606.90654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096606.90657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096606.90659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096606.90662: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096606.90664: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096606.90666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096606.90671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096606.90675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096606.90679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096606.90787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096606.90859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096607.08095: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-23 09:03:27.061378", "end": "2024-09-23 09:03:27.079876", "delta": "0:00:00.018498", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096607.09982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096607.09986: stdout chunk (state=3): >>><<< 32980 1727096607.09988: stderr chunk (state=3): >>><<< 32980 1727096607.10008: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-23 09:03:27.061378", "end": "2024-09-23 09:03:27.079876", "delta": "0:00:00.018498", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096607.10048: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096607.10071: _low_level_execute_command(): starting 32980 1727096607.10081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096606.770762-33874-187221516937914/ > /dev/null 2>&1 && sleep 0' 32980 1727096607.11270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096607.11337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096607.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096607.11366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096607.11380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096607.11485: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096607.11524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096607.11660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096607.11703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096607.11727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096607.13588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096607.13626: stderr chunk (state=3): >>><<< 32980 1727096607.13629: stdout chunk (state=3): >>><<< 32980 1727096607.13715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096607.13719: handler run complete 32980 1727096607.13721: Evaluated conditional (False): False 32980 1727096607.13723: attempt loop complete, returning result 32980 1727096607.13725: _execute() done 32980 1727096607.13727: dumping result to json 32980 1727096607.13729: done dumping result, returning 32980 1727096607.13731: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-457d-ef33-00000000083b] 32980 1727096607.13733: sending task result for task 0afff68d-5257-457d-ef33-00000000083b ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.018498", "end": "2024-09-23 09:03:27.079876", "rc": 0, "start": "2024-09-23 09:03:27.061378" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 32980 1727096607.13991: no more pending results, returning what we have 32980 1727096607.13995: results queue empty 32980 1727096607.13996: checking for any_errors_fatal 32980 1727096607.14002: done checking for any_errors_fatal 32980 1727096607.14003: checking for max_fail_percentage 32980 1727096607.14005: done checking for max_fail_percentage 32980 1727096607.14006: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.14007: done checking to see if all hosts have failed 32980 1727096607.14007: getting the remaining hosts for this loop 32980 1727096607.14009: done getting the remaining hosts for this loop 32980 1727096607.14013: getting the next task for host managed_node2 32980 1727096607.14022: done getting next task for host managed_node2 32980 1727096607.14025: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32980 1727096607.14029: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.14032: getting variables 32980 1727096607.14034: in VariableManager get_vars() 32980 1727096607.14183: Calling all_inventory to load vars for managed_node2 32980 1727096607.14186: Calling groups_inventory to load vars for managed_node2 32980 1727096607.14189: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.14201: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.14204: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.14208: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.15079: done sending task result for task 0afff68d-5257-457d-ef33-00000000083b 32980 1727096607.15082: WORKER PROCESS EXITING 32980 1727096607.17176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.20625: done with get_vars() 32980 1727096607.20657: done getting variables 32980 1727096607.20764: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:03:27 -0400 (0:00:00.488) 0:00:19.135 ****** 32980 1727096607.20905: entering _queue_task() for managed_node2/set_fact 32980 1727096607.21786: worker is 1 (out of 1 available) 32980 1727096607.21801: exiting _queue_task() for managed_node2/set_fact 32980 1727096607.21811: done queuing things up, now waiting for results queue to drain 32980 1727096607.21813: waiting for pending results... 
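Note (editor): the result logged at 09:03:27 shows the "Get NM profile info" shell task finding the keyfile for lsr101.90 under /etc/NetworkManager/system-connections/. A sketch of that task, reconstructed from the cmd string and the registered variable name (nm_profile_exists) visible in the debug output, follows; using the literal profile name rather than a templated variable is an assumption, as is the changed_when note in the comment.

# Reconstructed sketch of the "Get NM profile info" task; cmd string and register
# name are taken from the debug output above.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc
  register: nm_profile_exists
  # The played result reports changed: false even though the raw module reported a
  # change, which suggests the real task suppresses it (e.g. changed_when: false);
  # that detail is an inference, not shown directly in this log.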
32980 1727096607.22258: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32980 1727096607.22583: in run() - task 0afff68d-5257-457d-ef33-00000000083c 32980 1727096607.22633: variable 'ansible_search_path' from source: unknown 32980 1727096607.22684: variable 'ansible_search_path' from source: unknown 32980 1727096607.22794: calling self._execute() 32980 1727096607.23025: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.23039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.23064: variable 'omit' from source: magic vars 32980 1727096607.23940: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.24050: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.24348: variable 'nm_profile_exists' from source: set_fact 32980 1727096607.24416: Evaluated conditional (nm_profile_exists.rc == 0): True 32980 1727096607.24485: variable 'omit' from source: magic vars 32980 1727096607.24541: variable 'omit' from source: magic vars 32980 1727096607.24681: variable 'omit' from source: magic vars 32980 1727096607.24801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096607.25019: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096607.25022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096607.25024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.25026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.25063: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096607.25135: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.25146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.25325: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096607.25392: Set connection var ansible_timeout to 10 32980 1727096607.25432: Set connection var ansible_shell_type to sh 32980 1727096607.25439: Set connection var ansible_connection to ssh 32980 1727096607.25460: Set connection var ansible_shell_executable to /bin/sh 32980 1727096607.25471: Set connection var ansible_pipelining to False 32980 1727096607.25537: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.25544: variable 'ansible_connection' from source: unknown 32980 1727096607.25569: variable 'ansible_module_compression' from source: unknown 32980 1727096607.25580: variable 'ansible_shell_type' from source: unknown 32980 1727096607.25678: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.25681: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.25683: variable 'ansible_pipelining' from source: unknown 32980 1727096607.25686: variable 'ansible_timeout' from source: unknown 32980 1727096607.25689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.25976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096607.26054: variable 'omit' from source: magic vars 32980 1727096607.26066: starting attempt loop 32980 1727096607.26110: running the handler 32980 1727096607.26128: handler run complete 32980 1727096607.26341: attempt loop complete, returning result 32980 1727096607.26344: _execute() done 32980 1727096607.26346: dumping result to json 32980 1727096607.26348: done dumping result, returning 32980 1727096607.26351: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-457d-ef33-00000000083c] 32980 1727096607.26353: sending task result for task 0afff68d-5257-457d-ef33-00000000083c ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 32980 1727096607.26539: no more pending results, returning what we have 32980 1727096607.26543: results queue empty 32980 1727096607.26543: checking for any_errors_fatal 32980 1727096607.26552: done checking for any_errors_fatal 32980 1727096607.26553: checking for max_fail_percentage 32980 1727096607.26554: done checking for max_fail_percentage 32980 1727096607.26555: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.26556: done checking to see if all hosts have failed 32980 1727096607.26557: getting the remaining hosts for this loop 32980 1727096607.26558: done getting the remaining hosts for this loop 32980 1727096607.26562: getting the next task for host managed_node2 32980 1727096607.26779: done getting next task for host managed_node2 32980 1727096607.26782: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 32980 1727096607.26786: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.26790: getting variables 32980 1727096607.26792: in VariableManager get_vars() 32980 1727096607.26830: Calling all_inventory to load vars for managed_node2 32980 1727096607.26833: Calling groups_inventory to load vars for managed_node2 32980 1727096607.26835: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.26865: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.26870: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.26876: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.27711: done sending task result for task 0afff68d-5257-457d-ef33-00000000083c 32980 1727096607.27714: WORKER PROCESS EXITING 32980 1727096607.28941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.30698: done with get_vars() 32980 1727096607.30720: done getting variables 32980 1727096607.30784: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.30914: variable 'profile' from source: include params 32980 1727096607.30918: variable 'item' from source: include params 32980 1727096607.30996: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101.90] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 09:03:27 -0400 (0:00:00.101) 0:00:19.236 ****** 32980 1727096607.31031: entering _queue_task() for managed_node2/command 32980 1727096607.31343: worker is 1 (out of 1 available) 32980 1727096607.31356: exiting _queue_task() for managed_node2/command 32980 1727096607.31370: done queuing things up, now waiting for results queue to drain 32980 1727096607.31371: waiting for pending results... 
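The set_fact result above sets all three lsr_net_profile_* flags in a single step, guarded by the registered nmcli lookup (nm_profile_exists.rc == 0). A minimal sketch of what the task at get_profile_stat.yml:35 plausibly looks like, reconstructed only from the logged conditionals and returned ansible_facts rather than from the file itself:

# Hypothetical reconstruction from the trace above, not the actual file contents:
# the when expression and the three facts are taken from the logged result.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
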
32980 1727096607.31649: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 32980 1727096607.31770: in run() - task 0afff68d-5257-457d-ef33-00000000083e 32980 1727096607.31785: variable 'ansible_search_path' from source: unknown 32980 1727096607.31789: variable 'ansible_search_path' from source: unknown 32980 1727096607.31829: calling self._execute() 32980 1727096607.31920: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.31926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.31937: variable 'omit' from source: magic vars 32980 1727096607.32315: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.32326: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.32434: variable 'profile_stat' from source: set_fact 32980 1727096607.32452: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096607.32455: when evaluation is False, skipping this task 32980 1727096607.32458: _execute() done 32980 1727096607.32461: dumping result to json 32980 1727096607.32463: done dumping result, returning 32980 1727096607.32470: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [0afff68d-5257-457d-ef33-00000000083e] 32980 1727096607.32477: sending task result for task 0afff68d-5257-457d-ef33-00000000083e skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096607.32603: no more pending results, returning what we have 32980 1727096607.32607: results queue empty 32980 1727096607.32608: checking for any_errors_fatal 32980 1727096607.32617: done checking for any_errors_fatal 32980 1727096607.32618: checking for max_fail_percentage 32980 1727096607.32620: done checking for max_fail_percentage 32980 1727096607.32621: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.32621: done checking to see if all hosts have failed 32980 1727096607.32622: getting the remaining hosts for this loop 32980 1727096607.32624: done getting the remaining hosts for this loop 32980 1727096607.32627: getting the next task for host managed_node2 32980 1727096607.32636: done getting next task for host managed_node2 32980 1727096607.32638: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 32980 1727096607.32641: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.32645: getting variables 32980 1727096607.32647: in VariableManager get_vars() 32980 1727096607.32728: Calling all_inventory to load vars for managed_node2 32980 1727096607.32731: Calling groups_inventory to load vars for managed_node2 32980 1727096607.32734: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.32896: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.32900: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.32905: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.33473: done sending task result for task 0afff68d-5257-457d-ef33-00000000083e 32980 1727096607.33478: WORKER PROCESS EXITING 32980 1727096607.35212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.36945: done with get_vars() 32980 1727096607.36975: done getting variables 32980 1727096607.37041: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.37166: variable 'profile' from source: include params 32980 1727096607.37172: variable 'item' from source: include params 32980 1727096607.37233: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] ******************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 09:03:27 -0400 (0:00:00.062) 0:00:19.299 ****** 32980 1727096607.37275: entering _queue_task() for managed_node2/set_fact 32980 1727096607.37631: worker is 1 (out of 1 available) 32980 1727096607.37644: exiting _queue_task() for managed_node2/set_fact 32980 1727096607.37657: done queuing things up, now waiting for results queue to drain 32980 1727096607.37658: waiting for pending results... 
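This verify task, like the command task just skipped and the fingerprint pair that follows, is guarded by profile_stat.stat.exists, which is false here because no ifcfg-lsr101.90 file exists for the NM-managed profile. A hedged sketch of the guard pattern at get_profile_stat.yml:49 and :56; the module types (command, then set_fact) match the action plugins the trace loads, while the grep target, register name, and verification expression are illustrative assumptions:

# Sketch of the profile_stat guard pattern; the file path, register name and
# the verification expression are hypothetical illustrations.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ifcfg_ansible_managed.rc == 0 }}"   # assumed check
  when: profile_stat.stat.exists
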
32980 1727096607.37956: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 32980 1727096607.38214: in run() - task 0afff68d-5257-457d-ef33-00000000083f 32980 1727096607.38218: variable 'ansible_search_path' from source: unknown 32980 1727096607.38222: variable 'ansible_search_path' from source: unknown 32980 1727096607.38225: calling self._execute() 32980 1727096607.38270: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.38285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.38312: variable 'omit' from source: magic vars 32980 1727096607.39079: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.39084: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.39183: variable 'profile_stat' from source: set_fact 32980 1727096607.39210: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096607.39218: when evaluation is False, skipping this task 32980 1727096607.39226: _execute() done 32980 1727096607.39233: dumping result to json 32980 1727096607.39241: done dumping result, returning 32980 1727096607.39251: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [0afff68d-5257-457d-ef33-00000000083f] 32980 1727096607.39262: sending task result for task 0afff68d-5257-457d-ef33-00000000083f 32980 1727096607.39489: done sending task result for task 0afff68d-5257-457d-ef33-00000000083f skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096607.39618: no more pending results, returning what we have 32980 1727096607.39622: results queue empty 32980 1727096607.39624: checking for any_errors_fatal 32980 1727096607.39631: done checking for any_errors_fatal 32980 1727096607.39632: checking for max_fail_percentage 32980 1727096607.39634: done checking for max_fail_percentage 32980 1727096607.39634: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.39635: done checking to see if all hosts have failed 32980 1727096607.39636: getting the remaining hosts for this loop 32980 1727096607.39637: done getting the remaining hosts for this loop 32980 1727096607.39641: getting the next task for host managed_node2 32980 1727096607.39652: done getting next task for host managed_node2 32980 1727096607.39654: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 32980 1727096607.39658: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.39664: getting variables 32980 1727096607.39666: in VariableManager get_vars() 32980 1727096607.39716: Calling all_inventory to load vars for managed_node2 32980 1727096607.39720: Calling groups_inventory to load vars for managed_node2 32980 1727096607.39723: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.39736: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.39739: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.39743: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.40283: WORKER PROCESS EXITING 32980 1727096607.41860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.43523: done with get_vars() 32980 1727096607.43547: done getting variables 32980 1727096607.43615: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.43734: variable 'profile' from source: include params 32980 1727096607.43738: variable 'item' from source: include params 32980 1727096607.43802: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101.90] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 09:03:27 -0400 (0:00:00.065) 0:00:19.365 ****** 32980 1727096607.43833: entering _queue_task() for managed_node2/command 32980 1727096607.44192: worker is 1 (out of 1 available) 32980 1727096607.44209: exiting _queue_task() for managed_node2/command 32980 1727096607.44220: done queuing things up, now waiting for results queue to drain 32980 1727096607.44221: waiting for pending results... 
32980 1727096607.44544: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr101.90 32980 1727096607.44699: in run() - task 0afff68d-5257-457d-ef33-000000000840 32980 1727096607.44778: variable 'ansible_search_path' from source: unknown 32980 1727096607.44783: variable 'ansible_search_path' from source: unknown 32980 1727096607.44786: calling self._execute() 32980 1727096607.44870: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.44892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.44910: variable 'omit' from source: magic vars 32980 1727096607.45333: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.45351: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.45487: variable 'profile_stat' from source: set_fact 32980 1727096607.45536: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096607.45539: when evaluation is False, skipping this task 32980 1727096607.45542: _execute() done 32980 1727096607.45545: dumping result to json 32980 1727096607.45547: done dumping result, returning 32980 1727096607.45550: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [0afff68d-5257-457d-ef33-000000000840] 32980 1727096607.45552: sending task result for task 0afff68d-5257-457d-ef33-000000000840 32980 1727096607.45794: done sending task result for task 0afff68d-5257-457d-ef33-000000000840 32980 1727096607.45797: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096607.45851: no more pending results, returning what we have 32980 1727096607.45855: results queue empty 32980 1727096607.45856: checking for any_errors_fatal 32980 1727096607.45865: done checking for any_errors_fatal 32980 1727096607.45866: checking for max_fail_percentage 32980 1727096607.45870: done checking for max_fail_percentage 32980 1727096607.45871: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.45872: done checking to see if all hosts have failed 32980 1727096607.45875: getting the remaining hosts for this loop 32980 1727096607.45881: done getting the remaining hosts for this loop 32980 1727096607.45886: getting the next task for host managed_node2 32980 1727096607.45896: done getting next task for host managed_node2 32980 1727096607.45898: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 32980 1727096607.45903: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.45908: getting variables 32980 1727096607.45910: in VariableManager get_vars() 32980 1727096607.45953: Calling all_inventory to load vars for managed_node2 32980 1727096607.45956: Calling groups_inventory to load vars for managed_node2 32980 1727096607.45959: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.46103: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.46107: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.46111: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.47548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.49115: done with get_vars() 32980 1727096607.49145: done getting variables 32980 1727096607.49212: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.49334: variable 'profile' from source: include params 32980 1727096607.49337: variable 'item' from source: include params 32980 1727096607.49401: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101.90] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 09:03:27 -0400 (0:00:00.055) 0:00:19.420 ****** 32980 1727096607.49430: entering _queue_task() for managed_node2/set_fact 32980 1727096607.49816: worker is 1 (out of 1 available) 32980 1727096607.49828: exiting _queue_task() for managed_node2/set_fact 32980 1727096607.49840: done queuing things up, now waiting for results queue to drain 32980 1727096607.49842: waiting for pending results... 
32980 1727096607.50090: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 32980 1727096607.50328: in run() - task 0afff68d-5257-457d-ef33-000000000841 32980 1727096607.50333: variable 'ansible_search_path' from source: unknown 32980 1727096607.50336: variable 'ansible_search_path' from source: unknown 32980 1727096607.50338: calling self._execute() 32980 1727096607.50404: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.50415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.50436: variable 'omit' from source: magic vars 32980 1727096607.50811: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.50828: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.50963: variable 'profile_stat' from source: set_fact 32980 1727096607.50995: Evaluated conditional (profile_stat.stat.exists): False 32980 1727096607.51098: when evaluation is False, skipping this task 32980 1727096607.51102: _execute() done 32980 1727096607.51104: dumping result to json 32980 1727096607.51106: done dumping result, returning 32980 1727096607.51109: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [0afff68d-5257-457d-ef33-000000000841] 32980 1727096607.51111: sending task result for task 0afff68d-5257-457d-ef33-000000000841 32980 1727096607.51183: done sending task result for task 0afff68d-5257-457d-ef33-000000000841 32980 1727096607.51186: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32980 1727096607.51245: no more pending results, returning what we have 32980 1727096607.51248: results queue empty 32980 1727096607.51250: checking for any_errors_fatal 32980 1727096607.51258: done checking for any_errors_fatal 32980 1727096607.51258: checking for max_fail_percentage 32980 1727096607.51260: done checking for max_fail_percentage 32980 1727096607.51261: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.51262: done checking to see if all hosts have failed 32980 1727096607.51262: getting the remaining hosts for this loop 32980 1727096607.51264: done getting the remaining hosts for this loop 32980 1727096607.51269: getting the next task for host managed_node2 32980 1727096607.51281: done getting next task for host managed_node2 32980 1727096607.51284: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 32980 1727096607.51288: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.51292: getting variables 32980 1727096607.51370: in VariableManager get_vars() 32980 1727096607.51421: Calling all_inventory to load vars for managed_node2 32980 1727096607.51424: Calling groups_inventory to load vars for managed_node2 32980 1727096607.51426: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.51439: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.51442: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.51445: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.53016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.54641: done with get_vars() 32980 1727096607.54676: done getting variables 32980 1727096607.54735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.54856: variable 'profile' from source: include params 32980 1727096607.54860: variable 'item' from source: include params 32980 1727096607.54926: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101.90'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 09:03:27 -0400 (0:00:00.055) 0:00:19.476 ****** 32980 1727096607.54960: entering _queue_task() for managed_node2/assert 32980 1727096607.55320: worker is 1 (out of 1 available) 32980 1727096607.55333: exiting _queue_task() for managed_node2/assert 32980 1727096607.55346: done queuing things up, now waiting for results queue to drain 32980 1727096607.55347: waiting for pending results... 
32980 1727096607.55713: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'lsr101.90' 32980 1727096607.55720: in run() - task 0afff68d-5257-457d-ef33-0000000006c0 32980 1727096607.55723: variable 'ansible_search_path' from source: unknown 32980 1727096607.55726: variable 'ansible_search_path' from source: unknown 32980 1727096607.55729: calling self._execute() 32980 1727096607.55810: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.55815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.55819: variable 'omit' from source: magic vars 32980 1727096607.56169: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.56183: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.56188: variable 'omit' from source: magic vars 32980 1727096607.56229: variable 'omit' from source: magic vars 32980 1727096607.56322: variable 'profile' from source: include params 32980 1727096607.56325: variable 'item' from source: include params 32980 1727096607.56392: variable 'item' from source: include params 32980 1727096607.56460: variable 'omit' from source: magic vars 32980 1727096607.56464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096607.56498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096607.56519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096607.56535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.56548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.56583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096607.56587: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.56589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.56775: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096607.56785: Set connection var ansible_timeout to 10 32980 1727096607.56787: Set connection var ansible_shell_type to sh 32980 1727096607.56789: Set connection var ansible_connection to ssh 32980 1727096607.56791: Set connection var ansible_shell_executable to /bin/sh 32980 1727096607.56794: Set connection var ansible_pipelining to False 32980 1727096607.56796: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.56798: variable 'ansible_connection' from source: unknown 32980 1727096607.56800: variable 'ansible_module_compression' from source: unknown 32980 1727096607.56801: variable 'ansible_shell_type' from source: unknown 32980 1727096607.56803: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.56805: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.56807: variable 'ansible_pipelining' from source: unknown 32980 1727096607.56809: variable 'ansible_timeout' from source: unknown 32980 1727096607.56811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.56999: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096607.57003: variable 'omit' from source: magic vars 32980 1727096607.57005: starting attempt loop 32980 1727096607.57008: running the handler 32980 1727096607.57066: variable 'lsr_net_profile_exists' from source: set_fact 32980 1727096607.57076: Evaluated conditional (lsr_net_profile_exists): True 32980 1727096607.57080: handler run complete 32980 1727096607.57094: attempt loop complete, returning result 32980 1727096607.57097: _execute() done 32980 1727096607.57099: dumping result to json 32980 1727096607.57107: done dumping result, returning 32980 1727096607.57113: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'lsr101.90' [0afff68d-5257-457d-ef33-0000000006c0] 32980 1727096607.57119: sending task result for task 0afff68d-5257-457d-ef33-0000000006c0 32980 1727096607.57212: done sending task result for task 0afff68d-5257-457d-ef33-0000000006c0 32980 1727096607.57215: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096607.57296: no more pending results, returning what we have 32980 1727096607.57299: results queue empty 32980 1727096607.57300: checking for any_errors_fatal 32980 1727096607.57306: done checking for any_errors_fatal 32980 1727096607.57306: checking for max_fail_percentage 32980 1727096607.57308: done checking for max_fail_percentage 32980 1727096607.57309: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.57310: done checking to see if all hosts have failed 32980 1727096607.57310: getting the remaining hosts for this loop 32980 1727096607.57312: done getting the remaining hosts for this loop 32980 1727096607.57315: getting the next task for host managed_node2 32980 1727096607.57322: done getting next task for host managed_node2 32980 1727096607.57328: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 32980 1727096607.57331: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.57335: getting variables 32980 1727096607.57336: in VariableManager get_vars() 32980 1727096607.57376: Calling all_inventory to load vars for managed_node2 32980 1727096607.57379: Calling groups_inventory to load vars for managed_node2 32980 1727096607.57381: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.57390: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.57392: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.57394: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.58922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.60922: done with get_vars() 32980 1727096607.60943: done getting variables 32980 1727096607.61004: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.61119: variable 'profile' from source: include params 32980 1727096607.61123: variable 'item' from source: include params 32980 1727096607.61183: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 09:03:27 -0400 (0:00:00.062) 0:00:19.538 ****** 32980 1727096607.61219: entering _queue_task() for managed_node2/assert 32980 1727096607.61544: worker is 1 (out of 1 available) 32980 1727096607.61554: exiting _queue_task() for managed_node2/assert 32980 1727096607.61566: done queuing things up, now waiting for results queue to drain 32980 1727096607.61569: waiting for pending results... 
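The profile-present assertion passed because lsr_net_profile_exists is true from the earlier set_fact. Given that the trace loads the 'assert' action plugin, assert_profile_present.yml:5 is plausibly a single-flag check along these lines; the flag and task name come from the log, the fail message is an assumed illustration:

# Minimal sketch of the single-flag assert; fail_msg wording is hypothetical.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} was not found"   # assumed message
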
32980 1727096607.62004: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 32980 1727096607.62010: in run() - task 0afff68d-5257-457d-ef33-0000000006c1 32980 1727096607.62014: variable 'ansible_search_path' from source: unknown 32980 1727096607.62017: variable 'ansible_search_path' from source: unknown 32980 1727096607.62020: calling self._execute() 32980 1727096607.62208: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.62214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.62219: variable 'omit' from source: magic vars 32980 1727096607.62488: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.62499: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.62505: variable 'omit' from source: magic vars 32980 1727096607.62541: variable 'omit' from source: magic vars 32980 1727096607.62646: variable 'profile' from source: include params 32980 1727096607.62651: variable 'item' from source: include params 32980 1727096607.62716: variable 'item' from source: include params 32980 1727096607.62738: variable 'omit' from source: magic vars 32980 1727096607.62783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096607.62822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096607.62842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096607.62863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.62876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.62903: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096607.62912: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.62915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.63018: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096607.63030: Set connection var ansible_timeout to 10 32980 1727096607.63033: Set connection var ansible_shell_type to sh 32980 1727096607.63035: Set connection var ansible_connection to ssh 32980 1727096607.63043: Set connection var ansible_shell_executable to /bin/sh 32980 1727096607.63047: Set connection var ansible_pipelining to False 32980 1727096607.63082: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.63086: variable 'ansible_connection' from source: unknown 32980 1727096607.63088: variable 'ansible_module_compression' from source: unknown 32980 1727096607.63090: variable 'ansible_shell_type' from source: unknown 32980 1727096607.63093: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.63095: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.63097: variable 'ansible_pipelining' from source: unknown 32980 1727096607.63099: variable 'ansible_timeout' from source: unknown 32980 1727096607.63101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.63275: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096607.63279: variable 'omit' from source: magic vars 32980 1727096607.63282: starting attempt loop 32980 1727096607.63284: running the handler 32980 1727096607.63343: variable 'lsr_net_profile_ansible_managed' from source: set_fact 32980 1727096607.63349: Evaluated conditional (lsr_net_profile_ansible_managed): True 32980 1727096607.63360: handler run complete 32980 1727096607.63408: attempt loop complete, returning result 32980 1727096607.63412: _execute() done 32980 1727096607.63414: dumping result to json 32980 1727096607.63417: done dumping result, returning 32980 1727096607.63419: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [0afff68d-5257-457d-ef33-0000000006c1] 32980 1727096607.63421: sending task result for task 0afff68d-5257-457d-ef33-0000000006c1 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096607.63536: no more pending results, returning what we have 32980 1727096607.63540: results queue empty 32980 1727096607.63542: checking for any_errors_fatal 32980 1727096607.63549: done checking for any_errors_fatal 32980 1727096607.63550: checking for max_fail_percentage 32980 1727096607.63552: done checking for max_fail_percentage 32980 1727096607.63553: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.63554: done checking to see if all hosts have failed 32980 1727096607.63555: getting the remaining hosts for this loop 32980 1727096607.63556: done getting the remaining hosts for this loop 32980 1727096607.63560: getting the next task for host managed_node2 32980 1727096607.63571: done getting next task for host managed_node2 32980 1727096607.63574: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 32980 1727096607.63577: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.63581: getting variables 32980 1727096607.63583: in VariableManager get_vars() 32980 1727096607.63630: Calling all_inventory to load vars for managed_node2 32980 1727096607.63633: Calling groups_inventory to load vars for managed_node2 32980 1727096607.63636: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.63648: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.63652: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.63655: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.64211: done sending task result for task 0afff68d-5257-457d-ef33-0000000006c1 32980 1727096607.64215: WORKER PROCESS EXITING 32980 1727096607.65413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.67078: done with get_vars() 32980 1727096607.67105: done getting variables 32980 1727096607.67159: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096607.67283: variable 'profile' from source: include params 32980 1727096607.67287: variable 'item' from source: include params 32980 1727096607.67339: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 09:03:27 -0400 (0:00:00.061) 0:00:19.600 ****** 32980 1727096607.67391: entering _queue_task() for managed_node2/assert 32980 1727096607.67775: worker is 1 (out of 1 available) 32980 1727096607.67790: exiting _queue_task() for managed_node2/assert 32980 1727096607.67801: done queuing things up, now waiting for results queue to drain 32980 1727096607.67803: waiting for pending results... 
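The ansible-managed assertion just passed, and the fingerprint assertion queued above follows the same single-flag pattern; a sketch of both, assuming assert_profile_present.yml:10 and :15 check exactly the flags named in the evaluated conditionals:

# Sketch only: single-flag asserts mirroring assert_profile_present.yml:10 and :15.
- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
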
32980 1727096607.68281: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in lsr101.90 32980 1727096607.68288: in run() - task 0afff68d-5257-457d-ef33-0000000006c2 32980 1727096607.68290: variable 'ansible_search_path' from source: unknown 32980 1727096607.68293: variable 'ansible_search_path' from source: unknown 32980 1727096607.68295: calling self._execute() 32980 1727096607.68339: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.68343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.68356: variable 'omit' from source: magic vars 32980 1727096607.68725: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.68741: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.68744: variable 'omit' from source: magic vars 32980 1727096607.68791: variable 'omit' from source: magic vars 32980 1727096607.68915: variable 'profile' from source: include params 32980 1727096607.68918: variable 'item' from source: include params 32980 1727096607.68982: variable 'item' from source: include params 32980 1727096607.69001: variable 'omit' from source: magic vars 32980 1727096607.69056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096607.69090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096607.69110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096607.69127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.69138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.69178: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096607.69181: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.69183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.69289: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096607.69376: Set connection var ansible_timeout to 10 32980 1727096607.69380: Set connection var ansible_shell_type to sh 32980 1727096607.69382: Set connection var ansible_connection to ssh 32980 1727096607.69386: Set connection var ansible_shell_executable to /bin/sh 32980 1727096607.69388: Set connection var ansible_pipelining to False 32980 1727096607.69390: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.69392: variable 'ansible_connection' from source: unknown 32980 1727096607.69394: variable 'ansible_module_compression' from source: unknown 32980 1727096607.69396: variable 'ansible_shell_type' from source: unknown 32980 1727096607.69398: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.69408: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.69413: variable 'ansible_pipelining' from source: unknown 32980 1727096607.69416: variable 'ansible_timeout' from source: unknown 32980 1727096607.69418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.69520: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096607.69527: variable 'omit' from source: magic vars 32980 1727096607.69529: starting attempt loop 32980 1727096607.69532: running the handler 32980 1727096607.69646: variable 'lsr_net_profile_fingerprint' from source: set_fact 32980 1727096607.69652: Evaluated conditional (lsr_net_profile_fingerprint): True 32980 1727096607.69658: handler run complete 32980 1727096607.69677: attempt loop complete, returning result 32980 1727096607.69680: _execute() done 32980 1727096607.69683: dumping result to json 32980 1727096607.69685: done dumping result, returning 32980 1727096607.69688: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in lsr101.90 [0afff68d-5257-457d-ef33-0000000006c2] 32980 1727096607.69737: sending task result for task 0afff68d-5257-457d-ef33-0000000006c2 32980 1727096607.69802: done sending task result for task 0afff68d-5257-457d-ef33-0000000006c2 32980 1727096607.69805: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 32980 1727096607.69900: no more pending results, returning what we have 32980 1727096607.69903: results queue empty 32980 1727096607.69904: checking for any_errors_fatal 32980 1727096607.69910: done checking for any_errors_fatal 32980 1727096607.69911: checking for max_fail_percentage 32980 1727096607.69913: done checking for max_fail_percentage 32980 1727096607.69913: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.69914: done checking to see if all hosts have failed 32980 1727096607.69915: getting the remaining hosts for this loop 32980 1727096607.69916: done getting the remaining hosts for this loop 32980 1727096607.69919: getting the next task for host managed_node2 32980 1727096607.69928: done getting next task for host managed_node2 32980 1727096607.69930: ^ task is: TASK: TEARDOWN: remove profiles. 32980 1727096607.69932: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.69935: getting variables 32980 1727096607.69937: in VariableManager get_vars() 32980 1727096607.70078: Calling all_inventory to load vars for managed_node2 32980 1727096607.70082: Calling groups_inventory to load vars for managed_node2 32980 1727096607.70085: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.70094: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.70097: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.70100: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.71689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.73348: done with get_vars() 32980 1727096607.73371: done getting variables 32980 1727096607.73437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Monday 23 September 2024 09:03:27 -0400 (0:00:00.060) 0:00:19.661 ****** 32980 1727096607.73465: entering _queue_task() for managed_node2/debug 32980 1727096607.73991: worker is 1 (out of 1 available) 32980 1727096607.74001: exiting _queue_task() for managed_node2/debug 32980 1727096607.74011: done queuing things up, now waiting for results queue to drain 32980 1727096607.74012: waiting for pending results... 32980 1727096607.74145: running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. 
32980 1727096607.74242: in run() - task 0afff68d-5257-457d-ef33-00000000005d 32980 1727096607.74245: variable 'ansible_search_path' from source: unknown 32980 1727096607.74259: calling self._execute() 32980 1727096607.74609: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.74614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.74617: variable 'omit' from source: magic vars 32980 1727096607.74953: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.74972: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.74987: variable 'omit' from source: magic vars 32980 1727096607.75020: variable 'omit' from source: magic vars 32980 1727096607.75060: variable 'omit' from source: magic vars 32980 1727096607.75115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096607.75152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096607.75221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096607.75224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.75226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096607.75250: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096607.75257: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.75264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.75382: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096607.75392: Set connection var ansible_timeout to 10 32980 1727096607.75399: Set connection var ansible_shell_type to sh 32980 1727096607.75439: Set connection var ansible_connection to ssh 32980 1727096607.75442: Set connection var ansible_shell_executable to /bin/sh 32980 1727096607.75444: Set connection var ansible_pipelining to False 32980 1727096607.75454: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.75460: variable 'ansible_connection' from source: unknown 32980 1727096607.75466: variable 'ansible_module_compression' from source: unknown 32980 1727096607.75477: variable 'ansible_shell_type' from source: unknown 32980 1727096607.75483: variable 'ansible_shell_executable' from source: unknown 32980 1727096607.75489: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.75548: variable 'ansible_pipelining' from source: unknown 32980 1727096607.75551: variable 'ansible_timeout' from source: unknown 32980 1727096607.75553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.75656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096607.75680: variable 'omit' from source: magic vars 32980 1727096607.75692: starting attempt loop 32980 1727096607.75698: running the handler 32980 1727096607.75747: handler run complete 32980 1727096607.75784: attempt loop complete, 
returning result 32980 1727096607.75790: _execute() done 32980 1727096607.75796: dumping result to json 32980 1727096607.75879: done dumping result, returning 32980 1727096607.75882: done running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. [0afff68d-5257-457d-ef33-00000000005d] 32980 1727096607.75884: sending task result for task 0afff68d-5257-457d-ef33-00000000005d 32980 1727096607.75950: done sending task result for task 0afff68d-5257-457d-ef33-00000000005d 32980 1727096607.75953: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 32980 1727096607.76006: no more pending results, returning what we have 32980 1727096607.76010: results queue empty 32980 1727096607.76011: checking for any_errors_fatal 32980 1727096607.76017: done checking for any_errors_fatal 32980 1727096607.76017: checking for max_fail_percentage 32980 1727096607.76020: done checking for max_fail_percentage 32980 1727096607.76020: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.76021: done checking to see if all hosts have failed 32980 1727096607.76022: getting the remaining hosts for this loop 32980 1727096607.76023: done getting the remaining hosts for this loop 32980 1727096607.76027: getting the next task for host managed_node2 32980 1727096607.76035: done getting next task for host managed_node2 32980 1727096607.76042: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32980 1727096607.76045: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096607.76062: getting variables 32980 1727096607.76064: in VariableManager get_vars() 32980 1727096607.76107: Calling all_inventory to load vars for managed_node2 32980 1727096607.76110: Calling groups_inventory to load vars for managed_node2 32980 1727096607.76113: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.76123: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.76126: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.76129: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.77704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.79309: done with get_vars() 32980 1727096607.79336: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:03:27 -0400 (0:00:00.059) 0:00:19.721 ****** 32980 1727096607.79444: entering _queue_task() for managed_node2/include_tasks 32980 1727096607.79750: worker is 1 (out of 1 available) 32980 1727096607.79879: exiting _queue_task() for managed_node2/include_tasks 32980 1727096607.79891: done queuing things up, now waiting for results queue to drain 32980 1727096607.79892: waiting for pending results... 
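The teardown banner is produced by a bare debug task, and the role entry point queued at the end of this block is an include_tasks that, as the trace below shows, pulls in set_facts.yml. A minimal sketch of both, assuming nothing beyond what the trace reports; the banner text is copied from the logged MSG, and the include path inside main.yml is an assumption based on the file the include resolves to:

# Sketch only: the debug message is the banner shown in the result above;
# the include target is inferred from the file the trace goes on to load.
- name: "TEARDOWN: remove profiles."
  debug:
    msg: "##################################################"

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml   # resolved as roles/network/tasks/set_facts.yml in the trace
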
32980 1727096607.80189: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32980 1727096607.80220: in run() - task 0afff68d-5257-457d-ef33-000000000065 32980 1727096607.80286: variable 'ansible_search_path' from source: unknown 32980 1727096607.80289: variable 'ansible_search_path' from source: unknown 32980 1727096607.80293: calling self._execute() 32980 1727096607.80390: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.80404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.80418: variable 'omit' from source: magic vars 32980 1727096607.80811: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.80841: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.80940: _execute() done 32980 1727096607.80944: dumping result to json 32980 1727096607.80947: done dumping result, returning 32980 1727096607.80949: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-457d-ef33-000000000065] 32980 1727096607.80952: sending task result for task 0afff68d-5257-457d-ef33-000000000065 32980 1727096607.81029: done sending task result for task 0afff68d-5257-457d-ef33-000000000065 32980 1727096607.81033: WORKER PROCESS EXITING 32980 1727096607.81089: no more pending results, returning what we have 32980 1727096607.81094: in VariableManager get_vars() 32980 1727096607.81139: Calling all_inventory to load vars for managed_node2 32980 1727096607.81142: Calling groups_inventory to load vars for managed_node2 32980 1727096607.81145: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.81163: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.81166: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.81171: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.82908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.84507: done with get_vars() 32980 1727096607.84532: variable 'ansible_search_path' from source: unknown 32980 1727096607.84533: variable 'ansible_search_path' from source: unknown 32980 1727096607.84580: we have included files to process 32980 1727096607.84582: generating all_blocks data 32980 1727096607.84584: done generating all_blocks data 32980 1727096607.84591: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096607.84592: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096607.84594: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32980 1727096607.85177: done processing included file 32980 1727096607.85179: iterating over new_blocks loaded from include file 32980 1727096607.85181: in VariableManager get_vars() 32980 1727096607.85205: done with get_vars() 32980 1727096607.85207: filtering new block on tags 32980 1727096607.85225: done filtering new block on tags 32980 1727096607.85228: in VariableManager get_vars() 32980 1727096607.85249: done with get_vars() 32980 1727096607.85250: filtering new block on tags 32980 1727096607.85280: done filtering new block on tags 32980 1727096607.85284: in 
VariableManager get_vars() 32980 1727096607.85308: done with get_vars() 32980 1727096607.85310: filtering new block on tags 32980 1727096607.85327: done filtering new block on tags 32980 1727096607.85329: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 32980 1727096607.85334: extending task lists for all hosts with included blocks 32980 1727096607.86171: done extending task lists 32980 1727096607.86174: done processing included files 32980 1727096607.86176: results queue empty 32980 1727096607.86176: checking for any_errors_fatal 32980 1727096607.86179: done checking for any_errors_fatal 32980 1727096607.86180: checking for max_fail_percentage 32980 1727096607.86181: done checking for max_fail_percentage 32980 1727096607.86182: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.86183: done checking to see if all hosts have failed 32980 1727096607.86184: getting the remaining hosts for this loop 32980 1727096607.86185: done getting the remaining hosts for this loop 32980 1727096607.86187: getting the next task for host managed_node2 32980 1727096607.86191: done getting next task for host managed_node2 32980 1727096607.86194: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32980 1727096607.86197: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096607.86207: getting variables 32980 1727096607.86208: in VariableManager get_vars() 32980 1727096607.86225: Calling all_inventory to load vars for managed_node2 32980 1727096607.86227: Calling groups_inventory to load vars for managed_node2 32980 1727096607.86229: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.86239: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.86242: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.86245: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.87490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.89160: done with get_vars() 32980 1727096607.89190: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:03:27 -0400 (0:00:00.098) 0:00:19.819 ****** 32980 1727096607.89271: entering _queue_task() for managed_node2/setup 32980 1727096607.89708: worker is 1 (out of 1 available) 32980 1727096607.89726: exiting _queue_task() for managed_node2/setup 32980 1727096607.89738: done queuing things up, now waiting for results queue to drain 32980 1727096607.89740: waiting for pending results... 32980 1727096607.90059: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32980 1727096607.90179: in run() - task 0afff68d-5257-457d-ef33-000000000883 32980 1727096607.90200: variable 'ansible_search_path' from source: unknown 32980 1727096607.90209: variable 'ansible_search_path' from source: unknown 32980 1727096607.90251: calling self._execute() 32980 1727096607.90378: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096607.90382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096607.90389: variable 'omit' from source: magic vars 32980 1727096607.90810: variable 'ansible_distribution_major_version' from source: facts 32980 1727096607.90813: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096607.91011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096607.93202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096607.93332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096607.93355: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096607.93401: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096607.93441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096607.93534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096607.93583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 32980 1727096607.93658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096607.93664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096607.93690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096607.93748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096607.93791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096607.93821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096607.93866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096607.93988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096607.94070: variable '__network_required_facts' from source: role '' defaults 32980 1727096607.94203: variable 'ansible_facts' from source: unknown 32980 1727096607.94976: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32980 1727096607.94986: when evaluation is False, skipping this task 32980 1727096607.94993: _execute() done 32980 1727096607.94999: dumping result to json 32980 1727096607.95006: done dumping result, returning 32980 1727096607.95017: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-457d-ef33-000000000883] 32980 1727096607.95025: sending task result for task 0afff68d-5257-457d-ef33-000000000883 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096607.95225: no more pending results, returning what we have 32980 1727096607.95230: results queue empty 32980 1727096607.95231: checking for any_errors_fatal 32980 1727096607.95233: done checking for any_errors_fatal 32980 1727096607.95233: checking for max_fail_percentage 32980 1727096607.95235: done checking for max_fail_percentage 32980 1727096607.95236: checking to see if all hosts have failed and the running result is not ok 32980 1727096607.95237: done checking to see if all hosts have failed 32980 1727096607.95237: getting the remaining hosts for this loop 32980 1727096607.95239: done getting the remaining hosts for this loop 32980 1727096607.95243: getting the next task for host managed_node2 32980 1727096607.95254: done getting next task for host 
managed_node2 32980 1727096607.95258: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32980 1727096607.95262: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096607.95284: getting variables 32980 1727096607.95291: in VariableManager get_vars() 32980 1727096607.95334: Calling all_inventory to load vars for managed_node2 32980 1727096607.95337: Calling groups_inventory to load vars for managed_node2 32980 1727096607.95340: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096607.95351: Calling all_plugins_play to load vars for managed_node2 32980 1727096607.95354: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096607.95357: Calling groups_plugins_play to load vars for managed_node2 32980 1727096607.95882: done sending task result for task 0afff68d-5257-457d-ef33-000000000883 32980 1727096607.95886: WORKER PROCESS EXITING 32980 1727096607.97008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096607.98725: done with get_vars() 32980 1727096607.98759: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:03:27 -0400 (0:00:00.095) 0:00:19.915 ****** 32980 1727096607.98882: entering _queue_task() for managed_node2/stat 32980 1727096607.99286: worker is 1 (out of 1 available) 32980 1727096607.99300: exiting _queue_task() for managed_node2/stat 32980 1727096607.99424: done queuing things up, now waiting for results queue to drain 32980 1727096607.99426: waiting for pending results... 
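The skip recorded above comes from the guard task at set_facts.yml:3: the role only re-runs fact gathering when at least one fact named in __network_required_facts is missing from ansible_facts, and here the difference was empty, so the conditional evaluated False. A hedged reconstruction of that task is sketched below; the task name, the setup action and the when-expression are visible in the log, while the gather_subset value is purely an assumption.

# Hedged sketch of set_facts.yml:3 (not the verbatim role source).
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # assumed; the log does not show the module arguments
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0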
32980 1727096607.99659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 32980 1727096607.99866: in run() - task 0afff68d-5257-457d-ef33-000000000885 32980 1727096607.99899: variable 'ansible_search_path' from source: unknown 32980 1727096607.99907: variable 'ansible_search_path' from source: unknown 32980 1727096607.99946: calling self._execute() 32980 1727096608.00045: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096608.00076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096608.00080: variable 'omit' from source: magic vars 32980 1727096608.00507: variable 'ansible_distribution_major_version' from source: facts 32980 1727096608.00511: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096608.00710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096608.01002: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096608.01054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096608.01161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096608.01164: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096608.01221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096608.01251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096608.01296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096608.01327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096608.01431: variable '__network_is_ostree' from source: set_fact 32980 1727096608.01442: Evaluated conditional (not __network_is_ostree is defined): False 32980 1727096608.01448: when evaluation is False, skipping this task 32980 1727096608.01455: _execute() done 32980 1727096608.01461: dumping result to json 32980 1727096608.01466: done dumping result, returning 32980 1727096608.01481: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-457d-ef33-000000000885] 32980 1727096608.01489: sending task result for task 0afff68d-5257-457d-ef33-000000000885 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32980 1727096608.01723: no more pending results, returning what we have 32980 1727096608.01727: results queue empty 32980 1727096608.01728: checking for any_errors_fatal 32980 1727096608.01736: done checking for any_errors_fatal 32980 1727096608.01737: checking for max_fail_percentage 32980 1727096608.01739: done checking for max_fail_percentage 32980 1727096608.01739: checking to see if all hosts have 
failed and the running result is not ok 32980 1727096608.01740: done checking to see if all hosts have failed 32980 1727096608.01741: getting the remaining hosts for this loop 32980 1727096608.01743: done getting the remaining hosts for this loop 32980 1727096608.01746: getting the next task for host managed_node2 32980 1727096608.01756: done getting next task for host managed_node2 32980 1727096608.01760: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32980 1727096608.01764: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096608.01983: getting variables 32980 1727096608.01985: in VariableManager get_vars() 32980 1727096608.02022: Calling all_inventory to load vars for managed_node2 32980 1727096608.02025: Calling groups_inventory to load vars for managed_node2 32980 1727096608.02027: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096608.02035: Calling all_plugins_play to load vars for managed_node2 32980 1727096608.02038: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096608.02041: Calling groups_plugins_play to load vars for managed_node2 32980 1727096608.02584: done sending task result for task 0afff68d-5257-457d-ef33-000000000885 32980 1727096608.02588: WORKER PROCESS EXITING 32980 1727096608.08444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096608.10104: done with get_vars() 32980 1727096608.10130: done getting variables 32980 1727096608.10193: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:03:28 -0400 (0:00:00.113) 0:00:20.028 ****** 32980 1727096608.10225: entering _queue_task() for managed_node2/set_fact 32980 1727096608.10610: worker is 1 (out of 1 available) 32980 1727096608.10624: exiting _queue_task() for managed_node2/set_fact 32980 1727096608.10636: done queuing things up, now waiting for results queue to drain 32980 1727096608.10638: waiting for pending results... 
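The same pattern repeats for ostree detection: the stat-based check at set_facts.yml:12 was skipped because __network_is_ostree is already defined from an earlier run (not __network_is_ostree is defined evaluated False), and the companion set_fact at set_facts.yml:17 that follows is guarded by the same condition. A rough sketch of that pair, assuming a typical stat-then-set_fact wiring, is shown below; the stat path, register name and fact expression are illustrative assumptions, and only the task names, modules and conditionals actually appear in the log.

# Hedged sketch of set_facts.yml:12 and :17.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # assumed path, not shown in the log
  register: __ostree_booted_stat      # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed wiring
  when: not __network_is_ostree is defined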
32980 1727096608.10833: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32980 1727096608.10996: in run() - task 0afff68d-5257-457d-ef33-000000000886 32980 1727096608.11020: variable 'ansible_search_path' from source: unknown 32980 1727096608.11034: variable 'ansible_search_path' from source: unknown 32980 1727096608.11083: calling self._execute() 32980 1727096608.11204: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096608.11216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096608.11251: variable 'omit' from source: magic vars 32980 1727096608.11697: variable 'ansible_distribution_major_version' from source: facts 32980 1727096608.11799: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096608.11917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096608.12234: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096608.12288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096608.12376: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096608.12417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096608.12521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096608.12564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096608.12601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096608.12633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096608.12736: variable '__network_is_ostree' from source: set_fact 32980 1727096608.12749: Evaluated conditional (not __network_is_ostree is defined): False 32980 1727096608.12782: when evaluation is False, skipping this task 32980 1727096608.12786: _execute() done 32980 1727096608.12789: dumping result to json 32980 1727096608.12791: done dumping result, returning 32980 1727096608.12878: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-457d-ef33-000000000886] 32980 1727096608.12883: sending task result for task 0afff68d-5257-457d-ef33-000000000886 32980 1727096608.12956: done sending task result for task 0afff68d-5257-457d-ef33-000000000886 32980 1727096608.12959: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32980 1727096608.13016: no more pending results, returning what we have 32980 1727096608.13020: results queue empty 32980 1727096608.13021: checking for any_errors_fatal 32980 1727096608.13028: done checking for any_errors_fatal 32980 
1727096608.13029: checking for max_fail_percentage 32980 1727096608.13031: done checking for max_fail_percentage 32980 1727096608.13032: checking to see if all hosts have failed and the running result is not ok 32980 1727096608.13033: done checking to see if all hosts have failed 32980 1727096608.13034: getting the remaining hosts for this loop 32980 1727096608.13036: done getting the remaining hosts for this loop 32980 1727096608.13039: getting the next task for host managed_node2 32980 1727096608.13052: done getting next task for host managed_node2 32980 1727096608.13056: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32980 1727096608.13059: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096608.13083: getting variables 32980 1727096608.13085: in VariableManager get_vars() 32980 1727096608.13129: Calling all_inventory to load vars for managed_node2 32980 1727096608.13132: Calling groups_inventory to load vars for managed_node2 32980 1727096608.13135: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096608.13146: Calling all_plugins_play to load vars for managed_node2 32980 1727096608.13149: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096608.13152: Calling groups_plugins_play to load vars for managed_node2 32980 1727096608.14830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096608.16526: done with get_vars() 32980 1727096608.16550: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:03:28 -0400 (0:00:00.064) 0:00:20.093 ****** 32980 1727096608.16662: entering _queue_task() for managed_node2/service_facts 32980 1727096608.17027: worker is 1 (out of 1 available) 32980 1727096608.17040: exiting _queue_task() for managed_node2/service_facts 32980 1727096608.17054: done queuing things up, now waiting for results queue to drain 32980 1727096608.17055: waiting for pending results... 
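Unlike the skipped tasks above, the next task actually executes on the target: set_facts.yml:21 runs the service_facts module, and the remainder of this section shows the full low-level round trip (reusing the multiplexed SSH connection, creating a remote temp directory, pushing AnsiballZ_service_facts.py over SFTP, making it executable, running it with /usr/bin/python3.12, and streaming the resulting JSON back). The task itself is likely no more than the sketch below; only the task name and module are confirmed by the log.

# Hedged sketch of set_facts.yml:21. service_facts takes no arguments and
# populates ansible_facts.services, the large dictionary dumped later in this log.
- name: Check which services are running
  service_facts:

A role like this presumably consults entries such as ansible_facts.services['NetworkManager.service'].state when deciding which network provider to manage, though that use is not visible in this excerpt.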
32980 1727096608.17524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 32980 1727096608.17583: in run() - task 0afff68d-5257-457d-ef33-000000000888 32980 1727096608.17587: variable 'ansible_search_path' from source: unknown 32980 1727096608.17590: variable 'ansible_search_path' from source: unknown 32980 1727096608.17631: calling self._execute() 32980 1727096608.17741: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096608.17753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096608.17800: variable 'omit' from source: magic vars 32980 1727096608.18203: variable 'ansible_distribution_major_version' from source: facts 32980 1727096608.18221: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096608.18237: variable 'omit' from source: magic vars 32980 1727096608.18321: variable 'omit' from source: magic vars 32980 1727096608.18385: variable 'omit' from source: magic vars 32980 1727096608.18453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096608.18466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096608.18503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096608.18526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096608.18544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096608.18602: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096608.18605: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096608.18672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096608.18735: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096608.18747: Set connection var ansible_timeout to 10 32980 1727096608.18754: Set connection var ansible_shell_type to sh 32980 1727096608.18762: Set connection var ansible_connection to ssh 32980 1727096608.18784: Set connection var ansible_shell_executable to /bin/sh 32980 1727096608.18796: Set connection var ansible_pipelining to False 32980 1727096608.18829: variable 'ansible_shell_executable' from source: unknown 32980 1727096608.18838: variable 'ansible_connection' from source: unknown 32980 1727096608.18847: variable 'ansible_module_compression' from source: unknown 32980 1727096608.18854: variable 'ansible_shell_type' from source: unknown 32980 1727096608.18861: variable 'ansible_shell_executable' from source: unknown 32980 1727096608.18889: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096608.18892: variable 'ansible_pipelining' from source: unknown 32980 1727096608.18894: variable 'ansible_timeout' from source: unknown 32980 1727096608.18896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096608.19123: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096608.19216: variable 'omit' from source: magic vars 32980 
1727096608.19219: starting attempt loop 32980 1727096608.19222: running the handler 32980 1727096608.19224: _low_level_execute_command(): starting 32980 1727096608.19226: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096608.20032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096608.20115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096608.20139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096608.20154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096608.20236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096608.21919: stdout chunk (state=3): >>>/root <<< 32980 1727096608.22003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096608.22037: stderr chunk (state=3): >>><<< 32980 1727096608.22039: stdout chunk (state=3): >>><<< 32980 1727096608.22050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096608.22071: _low_level_execute_command(): starting 32980 1727096608.22086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928 `" && echo ansible-tmp-1727096608.2205627-33926-222495103405928="` echo 
/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928 `" ) && sleep 0' 32980 1727096608.22511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096608.22571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096608.22578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096608.22580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096608.22612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096608.24516: stdout chunk (state=3): >>>ansible-tmp-1727096608.2205627-33926-222495103405928=/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928 <<< 32980 1727096608.24626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096608.24654: stderr chunk (state=3): >>><<< 32980 1727096608.24657: stdout chunk (state=3): >>><<< 32980 1727096608.24674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096608.2205627-33926-222495103405928=/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096608.24775: variable 'ansible_module_compression' from source: unknown 32980 1727096608.24778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 32980 1727096608.24786: variable 'ansible_facts' 
from source: unknown 32980 1727096608.24845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py 32980 1727096608.24947: Sending initial data 32980 1727096608.24951: Sent initial data (162 bytes) 32980 1727096608.25356: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096608.25363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096608.25394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096608.25397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096608.25399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096608.25401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096608.25455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096608.25460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096608.25472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096608.25494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096608.27190: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096608.27249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096608.27288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmphjz0ndh4 /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py <<< 32980 1727096608.27292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py" <<< 32980 1727096608.27323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmphjz0ndh4" to remote "/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py" <<< 32980 1727096608.28460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096608.28520: stderr chunk (state=3): >>><<< 32980 1727096608.28524: stdout chunk (state=3): >>><<< 32980 1727096608.28632: done transferring module to remote 32980 1727096608.28644: _low_level_execute_command(): starting 32980 1727096608.28650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/ /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py && sleep 0' 32980 1727096608.29291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096608.29300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096608.29387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096608.29419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096608.29474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096608.29492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096608.29526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096608.31679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096608.31682: stdout chunk (state=3): >>><<< 32980 1727096608.31687: stderr chunk (state=3): >>><<< 32980 1727096608.31707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096608.31717: _low_level_execute_command(): starting 32980 1727096608.31782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/AnsiballZ_service_facts.py && sleep 0' 32980 1727096608.32382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096608.32398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096608.32414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096608.32485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096609.89518: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": 
{"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 32980 1727096609.89530: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 32980 1727096609.89607: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": 
{"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "<<< 32980 1727096609.89617: stdout chunk (state=3): >>>systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32980 1727096609.91333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096609.91337: stdout chunk (state=3): >>><<< 32980 1727096609.91340: stderr chunk (state=3): >>><<< 32980 1727096609.91344: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096609.92477: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096609.92481: _low_level_execute_command(): starting 32980 1727096609.92484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096608.2205627-33926-222495103405928/ > /dev/null 2>&1 && sleep 0' 32980 1727096609.93044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096609.93053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096609.93064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096609.93084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096609.93096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096609.93103: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096609.93112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096609.93130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096609.93138: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096609.93144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096609.93152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096609.93246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096609.93356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096609.93451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096609.95342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096609.95346: stdout chunk (state=3): >>><<< 32980 1727096609.95352: stderr chunk (state=3): >>><<< 32980 1727096609.95414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096609.95472: handler run complete 32980 1727096609.95828: variable 'ansible_facts' from source: unknown 32980 1727096609.96071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096609.97129: variable 'ansible_facts' from source: unknown 32980 1727096609.97481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096609.98002: attempt loop complete, returning result 32980 1727096609.98005: _execute() done 32980 1727096609.98007: dumping result to json 32980 1727096609.98009: done dumping result, returning 32980 1727096609.98272: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-457d-ef33-000000000888] 32980 1727096609.98276: sending task result for task 0afff68d-5257-457d-ef33-000000000888 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096609.99803: no more pending results, returning what we have 32980 1727096609.99806: results queue empty 32980 1727096609.99807: checking for any_errors_fatal 32980 1727096609.99810: done checking for any_errors_fatal 32980 1727096609.99811: checking for max_fail_percentage 32980 1727096609.99812: done checking for max_fail_percentage 32980 1727096609.99813: checking to see if all hosts have failed and the running result is not ok 32980 1727096609.99814: done checking to see if all hosts have failed 32980 1727096609.99815: getting the remaining hosts for this loop 32980 1727096609.99816: done getting the remaining hosts for this loop 32980 1727096609.99820: getting the next task for host managed_node2 32980 1727096609.99826: done getting next task for host managed_node2 32980 1727096609.99829: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32980 1727096609.99835: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096609.99846: getting variables 32980 1727096609.99847: in VariableManager get_vars() 32980 1727096609.99883: Calling all_inventory to load vars for managed_node2 32980 1727096609.99886: Calling groups_inventory to load vars for managed_node2 32980 1727096609.99889: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096609.99897: Calling all_plugins_play to load vars for managed_node2 32980 1727096609.99900: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096609.99902: Calling groups_plugins_play to load vars for managed_node2 32980 1727096610.00481: done sending task result for task 0afff68d-5257-457d-ef33-000000000888 32980 1727096610.00997: WORKER PROCESS EXITING 32980 1727096610.02321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.05539: done with get_vars() 32980 1727096610.05564: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:03:30 -0400 (0:00:01.892) 0:00:21.985 ****** 32980 1727096610.05864: entering _queue_task() for managed_node2/package_facts 32980 1727096610.06715: worker is 1 (out of 1 available) 32980 1727096610.06727: exiting _queue_task() for managed_node2/package_facts 32980 1727096610.06739: done queuing things up, now waiting for results queue to drain 32980 1727096610.06740: waiting for pending results... 
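
For reference, the two steps traced here (the completed service_facts task, whose result the log shows censored because 'no_log: true' was specified, and the package_facts task being queued next for "Check which packages are installed") are ordinary fact-gathering tasks. The snippet below is a minimal standalone sketch, not the actual contents of the role's set_facts.yml; the final debug task assumes only the name/state/status/source layout visible in the service_facts output dumped above.

    - hosts: managed_node2
      gather_facts: false
      tasks:
        - name: Check which services are running
          ansible.builtin.service_facts:
          no_log: true   # the log above shows this task's result censored by no_log

        - name: Check which packages are installed
          ansible.builtin.package_facts:

        # Consumes the structure dumped above: ansible_facts.services maps each
        # unit name to a dict with "state", "status" and "source" keys.
        - name: List services reported as running
          ansible.builtin.debug:
            msg: "{{ ansible_facts.services | dict2items
                     | selectattr('value.state', 'equalto', 'running')
                     | map(attribute='key') | list }}"

With the services dictionary shown earlier, that filter would return entries such as NetworkManager.service, sshd.service and systemd-journald.service.
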
32980 1727096610.06883: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 32980 1727096610.07225: in run() - task 0afff68d-5257-457d-ef33-000000000889 32980 1727096610.07574: variable 'ansible_search_path' from source: unknown 32980 1727096610.07578: variable 'ansible_search_path' from source: unknown 32980 1727096610.07580: calling self._execute() 32980 1727096610.07583: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.07586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.07589: variable 'omit' from source: magic vars 32980 1727096610.08415: variable 'ansible_distribution_major_version' from source: facts 32980 1727096610.08434: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096610.08446: variable 'omit' from source: magic vars 32980 1727096610.08522: variable 'omit' from source: magic vars 32980 1727096610.08872: variable 'omit' from source: magic vars 32980 1727096610.08876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096610.08891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096610.08913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096610.08937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096610.08955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096610.08992: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096610.09273: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.09276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.09294: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096610.09303: Set connection var ansible_timeout to 10 32980 1727096610.09308: Set connection var ansible_shell_type to sh 32980 1727096610.09312: Set connection var ansible_connection to ssh 32980 1727096610.09321: Set connection var ansible_shell_executable to /bin/sh 32980 1727096610.09327: Set connection var ansible_pipelining to False 32980 1727096610.09347: variable 'ansible_shell_executable' from source: unknown 32980 1727096610.09353: variable 'ansible_connection' from source: unknown 32980 1727096610.09358: variable 'ansible_module_compression' from source: unknown 32980 1727096610.09363: variable 'ansible_shell_type' from source: unknown 32980 1727096610.09370: variable 'ansible_shell_executable' from source: unknown 32980 1727096610.09377: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.09672: variable 'ansible_pipelining' from source: unknown 32980 1727096610.09676: variable 'ansible_timeout' from source: unknown 32980 1727096610.09678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.09780: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096610.10072: variable 'omit' from source: magic vars 32980 
1727096610.10076: starting attempt loop 32980 1727096610.10078: running the handler 32980 1727096610.10081: _low_level_execute_command(): starting 32980 1727096610.10083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096610.11386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.11472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096610.11486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.11537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.13257: stdout chunk (state=3): >>>/root <<< 32980 1727096610.13401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096610.13438: stderr chunk (state=3): >>><<< 32980 1727096610.13447: stdout chunk (state=3): >>><<< 32980 1727096610.13513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096610.13624: _low_level_execute_command(): starting 32980 1727096610.13636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396 `" && echo ansible-tmp-1727096610.1360962-34005-151797259295396="` echo /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396 `" ) && 
sleep 0' 32980 1727096610.14773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096610.14776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096610.14778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096610.14781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096610.14783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096610.14785: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096610.14787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.14796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096610.14799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096610.14804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096610.14813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096610.14822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096610.14834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096610.14846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096610.14855: stderr chunk (state=3): >>>debug2: match found <<< 32980 1727096610.14864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.14937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096610.14972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.15212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.17036: stdout chunk (state=3): >>>ansible-tmp-1727096610.1360962-34005-151797259295396=/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396 <<< 32980 1727096610.17172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096610.17181: stdout chunk (state=3): >>><<< 32980 1727096610.17188: stderr chunk (state=3): >>><<< 32980 1727096610.17206: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096610.1360962-34005-151797259295396=/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096610.17254: variable 'ansible_module_compression' from source: unknown 32980 1727096610.17313: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 32980 1727096610.17486: variable 'ansible_facts' from source: unknown 32980 1727096610.17906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py 32980 1727096610.18391: Sending initial data 32980 1727096610.18394: Sent initial data (162 bytes) 32980 1727096610.19528: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.19584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.19652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096610.19672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096610.19685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.19791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.21512: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096610.21564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096610.21600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpkw29ttbu /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py <<< 32980 1727096610.21604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py" <<< 32980 1727096610.21651: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpkw29ttbu" to remote "/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py" <<< 32980 1727096610.21655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py" <<< 32980 1727096610.24312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096610.24632: stderr chunk (state=3): >>><<< 32980 1727096610.24636: stdout chunk (state=3): >>><<< 32980 1727096610.24638: done transferring module to remote 32980 1727096610.24641: _low_level_execute_command(): starting 32980 1727096610.24643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/ /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py && sleep 0' 32980 1727096610.25834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.26008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096610.26020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.26059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.27904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096610.28032: stderr chunk (state=3): >>><<< 32980 1727096610.28040: stdout chunk (state=3): >>><<< 32980 1727096610.28090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096610.28113: _low_level_execute_command(): starting 32980 1727096610.28124: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/AnsiballZ_package_facts.py && sleep 0' 32980 1727096610.29283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096610.29462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096610.29465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.29473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096610.29484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096610.29587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.29604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.74540: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 32980 1727096610.74575: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": 
"20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 32980 1727096610.74608: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", 
"version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": 
"hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 32980 1727096610.74633: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 32980 1727096610.74655: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": 
[{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 32980 1727096610.74699: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", 
"version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", 
"version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 32980 1727096610.74712: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": 
"1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 32980 1727096610.74716: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 32980 1727096610.74744: 
stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", 
"version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 32980 1727096610.74756: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32980 1727096610.76562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096610.76598: stderr chunk (state=3): >>><<< 32980 1727096610.76601: stdout chunk (state=3): >>><<< 32980 1727096610.76733: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096610.78284: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096610.78289: _low_level_execute_command(): starting 32980 1727096610.78292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096610.1360962-34005-151797259295396/ > /dev/null 2>&1 && sleep 0' 32980 1727096610.78657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096610.78761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096610.78781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096610.78794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096610.78873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096610.80729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096610.80757: stderr chunk (state=3): >>><<< 32980 1727096610.80760: stdout chunk (state=3): >>><<< 32980 1727096610.80778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096610.80785: handler run complete 32980 1727096610.81383: variable 'ansible_facts' from source: unknown 32980 1727096610.81900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.83039: variable 'ansible_facts' from source: unknown 32980 1727096610.83282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.83669: attempt loop complete, returning result 32980 1727096610.83678: _execute() done 32980 1727096610.83681: dumping result to json 32980 1727096610.83793: done dumping result, returning 32980 1727096610.83801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-457d-ef33-000000000889] 32980 1727096610.83804: sending task result for task 0afff68d-5257-457d-ef33-000000000889 32980 1727096610.85085: done sending task result for task 0afff68d-5257-457d-ef33-000000000889 32980 1727096610.85088: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096610.85171: no more pending results, returning what we have 32980 1727096610.85174: results queue empty 32980 1727096610.85175: checking for any_errors_fatal 32980 1727096610.85178: done checking for any_errors_fatal 32980 1727096610.85178: checking for max_fail_percentage 32980 1727096610.85179: done checking for max_fail_percentage 32980 1727096610.85179: checking to see if all hosts have failed and the running result is not ok 32980 1727096610.85180: done checking to see if all hosts have failed 32980 1727096610.85180: getting the remaining hosts for this loop 32980 1727096610.85181: done getting the remaining hosts for this loop 32980 1727096610.85184: getting the next task for host managed_node2 32980 1727096610.85189: done getting next task for host managed_node2 32980 1727096610.85191: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32980 1727096610.85194: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096610.85200: getting variables 32980 1727096610.85201: in VariableManager get_vars() 32980 1727096610.85226: Calling all_inventory to load vars for managed_node2 32980 1727096610.85228: Calling groups_inventory to load vars for managed_node2 32980 1727096610.85230: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096610.85236: Calling all_plugins_play to load vars for managed_node2 32980 1727096610.85238: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096610.85239: Calling groups_plugins_play to load vars for managed_node2 32980 1727096610.85914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.86789: done with get_vars() 32980 1727096610.86805: done getting variables 32980 1727096610.86851: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:03:30 -0400 (0:00:00.810) 0:00:22.795 ****** 32980 1727096610.86879: entering _queue_task() for managed_node2/debug 32980 1727096610.87126: worker is 1 (out of 1 available) 32980 1727096610.87139: exiting _queue_task() for managed_node2/debug 32980 1727096610.87150: done queuing things up, now waiting for results queue to drain 32980 1727096610.87151: waiting for pending results... 32980 1727096610.87331: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 32980 1727096610.87421: in run() - task 0afff68d-5257-457d-ef33-000000000066 32980 1727096610.87431: variable 'ansible_search_path' from source: unknown 32980 1727096610.87434: variable 'ansible_search_path' from source: unknown 32980 1727096610.87462: calling self._execute() 32980 1727096610.87536: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.87540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.87549: variable 'omit' from source: magic vars 32980 1727096610.87830: variable 'ansible_distribution_major_version' from source: facts 32980 1727096610.87840: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096610.87846: variable 'omit' from source: magic vars 32980 1727096610.87889: variable 'omit' from source: magic vars 32980 1727096610.87958: variable 'network_provider' from source: set_fact 32980 1727096610.87976: variable 'omit' from source: magic vars 32980 1727096610.88007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096610.88033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096610.88052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096610.88064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096610.88078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 
1727096610.88099: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096610.88102: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.88104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.88179: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096610.88182: Set connection var ansible_timeout to 10 32980 1727096610.88184: Set connection var ansible_shell_type to sh 32980 1727096610.88187: Set connection var ansible_connection to ssh 32980 1727096610.88193: Set connection var ansible_shell_executable to /bin/sh 32980 1727096610.88197: Set connection var ansible_pipelining to False 32980 1727096610.88213: variable 'ansible_shell_executable' from source: unknown 32980 1727096610.88216: variable 'ansible_connection' from source: unknown 32980 1727096610.88219: variable 'ansible_module_compression' from source: unknown 32980 1727096610.88221: variable 'ansible_shell_type' from source: unknown 32980 1727096610.88223: variable 'ansible_shell_executable' from source: unknown 32980 1727096610.88225: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.88228: variable 'ansible_pipelining' from source: unknown 32980 1727096610.88230: variable 'ansible_timeout' from source: unknown 32980 1727096610.88236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.88337: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096610.88347: variable 'omit' from source: magic vars 32980 1727096610.88350: starting attempt loop 32980 1727096610.88352: running the handler 32980 1727096610.88392: handler run complete 32980 1727096610.88403: attempt loop complete, returning result 32980 1727096610.88406: _execute() done 32980 1727096610.88408: dumping result to json 32980 1727096610.88411: done dumping result, returning 32980 1727096610.88417: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-457d-ef33-000000000066] 32980 1727096610.88421: sending task result for task 0afff68d-5257-457d-ef33-000000000066 32980 1727096610.88505: done sending task result for task 0afff68d-5257-457d-ef33-000000000066 32980 1727096610.88508: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 32980 1727096610.88563: no more pending results, returning what we have 32980 1727096610.88565: results queue empty 32980 1727096610.88566: checking for any_errors_fatal 32980 1727096610.88579: done checking for any_errors_fatal 32980 1727096610.88579: checking for max_fail_percentage 32980 1727096610.88581: done checking for max_fail_percentage 32980 1727096610.88582: checking to see if all hosts have failed and the running result is not ok 32980 1727096610.88583: done checking to see if all hosts have failed 32980 1727096610.88584: getting the remaining hosts for this loop 32980 1727096610.88585: done getting the remaining hosts for this loop 32980 1727096610.88588: getting the next task for host managed_node2 32980 1727096610.88595: done getting next task for host managed_node2 32980 1727096610.88598: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 32980 1727096610.88601: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096610.88611: getting variables 32980 1727096610.88612: in VariableManager get_vars() 32980 1727096610.88646: Calling all_inventory to load vars for managed_node2 32980 1727096610.88650: Calling groups_inventory to load vars for managed_node2 32980 1727096610.88651: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096610.88660: Calling all_plugins_play to load vars for managed_node2 32980 1727096610.88662: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096610.88664: Calling groups_plugins_play to load vars for managed_node2 32980 1727096610.89534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.90414: done with get_vars() 32980 1727096610.90430: done getting variables 32980 1727096610.90476: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:03:30 -0400 (0:00:00.036) 0:00:22.831 ****** 32980 1727096610.90500: entering _queue_task() for managed_node2/fail 32980 1727096610.91072: worker is 1 (out of 1 available) 32980 1727096610.91084: exiting _queue_task() for managed_node2/fail 32980 1727096610.91096: done queuing things up, now waiting for results queue to drain 32980 1727096610.91097: waiting for pending results... 
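For reference, the "Print network provider" task traced above (roles/network/tasks/main.yml:7) resolves to a plain debug action whose message came out as "Using network provider: nm". A minimal sketch of an equivalent task, reconstructed only from what is visible in this trace (the wording in the actual role file may differ), would be:

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"

Here network_provider is the value established earlier via set_fact, as the "variable 'network_provider' from source: set_fact" line above indicates; the trace also shows the condition ansible_distribution_major_version != '6' being evaluated for this task, though whether that condition is written on the task itself or inherited from the calling play is not visible here.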
32980 1727096610.91286: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32980 1727096610.91291: in run() - task 0afff68d-5257-457d-ef33-000000000067 32980 1727096610.91451: variable 'ansible_search_path' from source: unknown 32980 1727096610.91454: variable 'ansible_search_path' from source: unknown 32980 1727096610.91457: calling self._execute() 32980 1727096610.91551: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.91576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.91587: variable 'omit' from source: magic vars 32980 1727096610.91904: variable 'ansible_distribution_major_version' from source: facts 32980 1727096610.91914: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096610.92005: variable 'network_state' from source: role '' defaults 32980 1727096610.92013: Evaluated conditional (network_state != {}): False 32980 1727096610.92016: when evaluation is False, skipping this task 32980 1727096610.92019: _execute() done 32980 1727096610.92021: dumping result to json 32980 1727096610.92024: done dumping result, returning 32980 1727096610.92031: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-457d-ef33-000000000067] 32980 1727096610.92036: sending task result for task 0afff68d-5257-457d-ef33-000000000067 32980 1727096610.92130: done sending task result for task 0afff68d-5257-457d-ef33-000000000067 32980 1727096610.92132: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096610.92179: no more pending results, returning what we have 32980 1727096610.92182: results queue empty 32980 1727096610.92183: checking for any_errors_fatal 32980 1727096610.92190: done checking for any_errors_fatal 32980 1727096610.92190: checking for max_fail_percentage 32980 1727096610.92192: done checking for max_fail_percentage 32980 1727096610.92193: checking to see if all hosts have failed and the running result is not ok 32980 1727096610.92194: done checking to see if all hosts have failed 32980 1727096610.92195: getting the remaining hosts for this loop 32980 1727096610.92196: done getting the remaining hosts for this loop 32980 1727096610.92199: getting the next task for host managed_node2 32980 1727096610.92207: done getting next task for host managed_node2 32980 1727096610.92210: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32980 1727096610.92213: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096610.92230: getting variables 32980 1727096610.92232: in VariableManager get_vars() 32980 1727096610.92270: Calling all_inventory to load vars for managed_node2 32980 1727096610.92272: Calling groups_inventory to load vars for managed_node2 32980 1727096610.92275: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096610.92284: Calling all_plugins_play to load vars for managed_node2 32980 1727096610.92286: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096610.92288: Calling groups_plugins_play to load vars for managed_node2 32980 1727096610.93042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.94290: done with get_vars() 32980 1727096610.94312: done getting variables 32980 1727096610.94371: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:03:30 -0400 (0:00:00.038) 0:00:22.870 ****** 32980 1727096610.94403: entering _queue_task() for managed_node2/fail 32980 1727096610.94893: worker is 1 (out of 1 available) 32980 1727096610.94902: exiting _queue_task() for managed_node2/fail 32980 1727096610.94913: done queuing things up, now waiting for results queue to drain 32980 1727096610.94914: waiting for pending results... 
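Both "Abort applying the network state configuration ..." guards in this trace are skipped with false_condition "network_state != {}", i.e. the role default for network_state is an empty dict in this run. A minimal sketch of such a guard task, implied by the skip output (the real tasks at main.yml:11 and main.yml:18 may carry additional conditions, such as the provider or distribution version, that never show up here because the first condition already evaluated to False), would be:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      fail:
        # illustrative message, not taken from the role
        msg: Applying the network state configuration is not supported with the initscripts provider
      when: network_state != {}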
32980 1727096610.95151: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32980 1727096610.95155: in run() - task 0afff68d-5257-457d-ef33-000000000068 32980 1727096610.95158: variable 'ansible_search_path' from source: unknown 32980 1727096610.95160: variable 'ansible_search_path' from source: unknown 32980 1727096610.95198: calling self._execute() 32980 1727096610.95303: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096610.95313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096610.95327: variable 'omit' from source: magic vars 32980 1727096610.95704: variable 'ansible_distribution_major_version' from source: facts 32980 1727096610.95721: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096610.95849: variable 'network_state' from source: role '' defaults 32980 1727096610.95864: Evaluated conditional (network_state != {}): False 32980 1727096610.95873: when evaluation is False, skipping this task 32980 1727096610.95900: _execute() done 32980 1727096610.95902: dumping result to json 32980 1727096610.95905: done dumping result, returning 32980 1727096610.95908: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-457d-ef33-000000000068] 32980 1727096610.95911: sending task result for task 0afff68d-5257-457d-ef33-000000000068 32980 1727096610.96206: done sending task result for task 0afff68d-5257-457d-ef33-000000000068 32980 1727096610.96210: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096610.96247: no more pending results, returning what we have 32980 1727096610.96250: results queue empty 32980 1727096610.96251: checking for any_errors_fatal 32980 1727096610.96256: done checking for any_errors_fatal 32980 1727096610.96257: checking for max_fail_percentage 32980 1727096610.96258: done checking for max_fail_percentage 32980 1727096610.96259: checking to see if all hosts have failed and the running result is not ok 32980 1727096610.96260: done checking to see if all hosts have failed 32980 1727096610.96260: getting the remaining hosts for this loop 32980 1727096610.96262: done getting the remaining hosts for this loop 32980 1727096610.96264: getting the next task for host managed_node2 32980 1727096610.96272: done getting next task for host managed_node2 32980 1727096610.96275: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32980 1727096610.96279: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096610.96294: getting variables 32980 1727096610.96296: in VariableManager get_vars() 32980 1727096610.96331: Calling all_inventory to load vars for managed_node2 32980 1727096610.96334: Calling groups_inventory to load vars for managed_node2 32980 1727096610.96336: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096610.96345: Calling all_plugins_play to load vars for managed_node2 32980 1727096610.96348: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096610.96350: Calling groups_plugins_play to load vars for managed_node2 32980 1727096610.97747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096610.99265: done with get_vars() 32980 1727096610.99293: done getting variables 32980 1727096610.99354: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:03:30 -0400 (0:00:00.049) 0:00:22.920 ****** 32980 1727096610.99392: entering _queue_task() for managed_node2/fail 32980 1727096610.99739: worker is 1 (out of 1 available) 32980 1727096610.99752: exiting _queue_task() for managed_node2/fail 32980 1727096610.99765: done queuing things up, now waiting for results queue to drain 32980 1727096610.99766: waiting for pending results... 
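[Editorial note] Both abort guards so far skip for the same reason: the play never sets `network_state`, so the role default (an empty mapping) is used. Purely for contrast, a play that did exercise these guards would pass a non-empty `network_state`; the shape below is an assumption, since this log only implies (via a conditional evaluated a few tasks later) that `network_state` can hold an `interfaces` list whose entries carry a `type` field.

---
# Illustrative only: a play supplying a non-empty `network_state`, which
# would make the `network_state != {}` part of the guards above evaluate
# True. The nmstate-style interface entry is an assumption.
- hosts: managed_node2
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_state:
      interfaces:
        - name: eth1        # assumed interface name
          type: ethernet
          state: up
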
32980 1727096611.00063: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32980 1727096611.00220: in run() - task 0afff68d-5257-457d-ef33-000000000069 32980 1727096611.00240: variable 'ansible_search_path' from source: unknown 32980 1727096611.00247: variable 'ansible_search_path' from source: unknown 32980 1727096611.00289: calling self._execute() 32980 1727096611.00382: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.00392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.00404: variable 'omit' from source: magic vars 32980 1727096611.00771: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.00809: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.00979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.03374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.03378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.03380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.03383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.03414: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.03501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.03546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.03578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.03628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.03649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.03756: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.03779: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32980 1727096611.03900: variable 'ansible_distribution' from source: facts 32980 1727096611.03908: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.03921: Evaluated conditional (ansible_distribution in __network_rh_distros): True 32980 1727096611.04157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.04188: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.04219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.04272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.04371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.04374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.04377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.04403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.04447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.04469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.04518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.04546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.04576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.04622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.04642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.04971: variable 'network_connections' from source: task vars 32980 1727096611.04988: variable 'interface' from source: play vars 32980 1727096611.05061: variable 'interface' from source: play vars 32980 1727096611.05131: variable 'vlan_interface' from source: play vars 32980 1727096611.05146: variable 'vlan_interface' from source: play vars 32980 1727096611.05160: variable 'network_state' from source: role '' defaults 
32980 1727096611.05231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.05407: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.05449: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.05492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.05525: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.05772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.05784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.05787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.05789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096611.05792: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 32980 1727096611.05795: when evaluation is False, skipping this task 32980 1727096611.05797: _execute() done 32980 1727096611.05799: dumping result to json 32980 1727096611.05801: done dumping result, returning 32980 1727096611.05803: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-457d-ef33-000000000069] 32980 1727096611.05806: sending task result for task 0afff68d-5257-457d-ef33-000000000069 32980 1727096611.05882: done sending task result for task 0afff68d-5257-457d-ef33-000000000069 32980 1727096611.05886: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 32980 1727096611.05935: no more pending results, returning what we have 32980 1727096611.05938: results queue empty 32980 1727096611.05940: checking for any_errors_fatal 32980 1727096611.05946: done checking for any_errors_fatal 32980 1727096611.05947: checking for max_fail_percentage 32980 1727096611.05949: done checking for max_fail_percentage 32980 1727096611.05950: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.05951: done checking to see if all hosts have failed 32980 1727096611.05951: getting the remaining hosts for this loop 32980 1727096611.05953: done getting the remaining hosts for this loop 32980 1727096611.05958: getting the 
next task for host managed_node2 32980 1727096611.05966: done getting next task for host managed_node2 32980 1727096611.05973: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32980 1727096611.05976: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096611.05993: getting variables 32980 1727096611.05995: in VariableManager get_vars() 32980 1727096611.06038: Calling all_inventory to load vars for managed_node2 32980 1727096611.06042: Calling groups_inventory to load vars for managed_node2 32980 1727096611.06044: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.06055: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.06059: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.06063: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.07647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.09350: done with get_vars() 32980 1727096611.09373: done getting variables 32980 1727096611.09430: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:03:31 -0400 (0:00:00.100) 0:00:23.021 ****** 32980 1727096611.09462: entering _queue_task() for managed_node2/dnf 32980 1727096611.09784: worker is 1 (out of 1 available) 32980 1727096611.09797: exiting _queue_task() for managed_node2/dnf 32980 1727096611.09809: done queuing things up, now waiting for results queue to drain 32980 1727096611.09810: waiting for pending results... 
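[Editorial note] The EL10 teaming guard above gets past its distribution checks (`ansible_distribution_major_version | int > 9` and `ansible_distribution in __network_rh_distros` are both True here) and then skips because neither `network_connections` nor `network_state` contains an entry whose `type` matches `^team$`. The full condition is printed verbatim in the skip result; a sketch of the corresponding `when` clause follows, with the task body and message being assumptions.

---
# Sketch keyed on the condition printed in the skip result above. Only the
# `when` expressions are taken from the log; the fail body is illustrative.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # illustrative
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", [])
      | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
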
32980 1727096611.10193: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32980 1727096611.10224: in run() - task 0afff68d-5257-457d-ef33-00000000006a 32980 1727096611.10239: variable 'ansible_search_path' from source: unknown 32980 1727096611.10245: variable 'ansible_search_path' from source: unknown 32980 1727096611.10284: calling self._execute() 32980 1727096611.10574: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.10578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.10581: variable 'omit' from source: magic vars 32980 1727096611.10881: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.10893: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.11120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.13638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.13713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.13757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.13797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.13863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.13927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.13959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.13996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.14039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.14057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.14179: variable 'ansible_distribution' from source: facts 32980 1727096611.14299: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.14302: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32980 1727096611.14330: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.14465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.14496: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.14529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.14573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.14592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.14639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.14669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.14698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.14744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.14760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.14804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.14829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.14861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.14903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.14920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.15086: variable 'network_connections' from source: task vars 32980 1727096611.15102: variable 'interface' from source: play vars 32980 1727096611.15172: variable 'interface' from source: play vars 32980 1727096611.15274: variable 'vlan_interface' from source: play vars 32980 1727096611.15277: variable 'vlan_interface' from source: play vars 32980 1727096611.15321: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.15491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.15531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.15565: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.15602: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.15647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.15675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.15720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.15748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096611.15799: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096611.16065: variable 'network_connections' from source: task vars 32980 1727096611.16078: variable 'interface' from source: play vars 32980 1727096611.16143: variable 'interface' from source: play vars 32980 1727096611.16156: variable 'vlan_interface' from source: play vars 32980 1727096611.16253: variable 'vlan_interface' from source: play vars 32980 1727096611.16256: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096611.16258: when evaluation is False, skipping this task 32980 1727096611.16260: _execute() done 32980 1727096611.16265: dumping result to json 32980 1727096611.16275: done dumping result, returning 32980 1727096611.16287: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000006a] 32980 1727096611.16296: sending task result for task 0afff68d-5257-457d-ef33-00000000006a skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096611.16452: no more pending results, returning what we have 32980 1727096611.16455: results queue empty 32980 1727096611.16457: checking for any_errors_fatal 32980 1727096611.16470: done checking for any_errors_fatal 32980 1727096611.16471: checking for max_fail_percentage 32980 1727096611.16473: done checking for max_fail_percentage 32980 1727096611.16474: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.16475: done checking to see if all hosts have failed 32980 1727096611.16476: getting the remaining hosts for this loop 32980 1727096611.16477: done getting the remaining hosts for this loop 32980 1727096611.16482: getting the next task for host managed_node2 
32980 1727096611.16491: done getting next task for host managed_node2 32980 1727096611.16495: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32980 1727096611.16498: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096611.16515: getting variables 32980 1727096611.16517: in VariableManager get_vars() 32980 1727096611.16559: Calling all_inventory to load vars for managed_node2 32980 1727096611.16562: Calling groups_inventory to load vars for managed_node2 32980 1727096611.16565: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.16781: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.16785: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.16788: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.17352: done sending task result for task 0afff68d-5257-457d-ef33-00000000006a 32980 1727096611.17356: WORKER PROCESS EXITING 32980 1727096611.18260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.19950: done with get_vars() 32980 1727096611.19980: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32980 1727096611.20060: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:03:31 -0400 (0:00:00.106) 0:00:23.127 ****** 32980 1727096611.20097: entering _queue_task() for managed_node2/yum 32980 1727096611.20601: worker is 1 (out of 1 available) 32980 1727096611.20612: exiting _queue_task() for managed_node2/yum 32980 1727096611.20624: done queuing things up, now waiting for results queue to drain 32980 1727096611.20625: waiting for pending results... 
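[Editorial note] The DNF update check above skips because both `__network_wireless_connections_defined` and `__network_team_connections_defined` resolve to False for the `network_connections` built from the `interface` and `vlan_interface` play vars. Purely for illustration, a `network_connections` list that would satisfy the team branch of that condition is sketched below; the log only shows that the role matches on `type`, so every other field is an assumption.

---
# Illustrative only: connection entries whose `type` would satisfy the
# selectattr("type", "match", "^team$") filter seen in the earlier guard.
# Fields other than `name` and `type` are assumptions.
network_connections:
  - name: team0
    type: team
    state: up
  - name: team0-port1
    type: ethernet
    controller: team0   # assumed controller/port wiring
    state: up
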
32980 1727096611.20864: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32980 1727096611.20928: in run() - task 0afff68d-5257-457d-ef33-00000000006b 32980 1727096611.20979: variable 'ansible_search_path' from source: unknown 32980 1727096611.20983: variable 'ansible_search_path' from source: unknown 32980 1727096611.21020: calling self._execute() 32980 1727096611.21124: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.21128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.21140: variable 'omit' from source: magic vars 32980 1727096611.21544: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.21575: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.21789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.23958: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.24024: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.24061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.24102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.24124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.24186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.24215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.24234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.24264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.24285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.24353: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.24366: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32980 1727096611.24371: when evaluation is False, skipping this task 32980 1727096611.24378: _execute() done 32980 1727096611.24381: dumping result to json 32980 1727096611.24383: done dumping result, returning 32980 1727096611.24386: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000006b] 32980 
1727096611.24394: sending task result for task 0afff68d-5257-457d-ef33-00000000006b skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32980 1727096611.24529: no more pending results, returning what we have 32980 1727096611.24532: results queue empty 32980 1727096611.24533: checking for any_errors_fatal 32980 1727096611.24539: done checking for any_errors_fatal 32980 1727096611.24540: checking for max_fail_percentage 32980 1727096611.24542: done checking for max_fail_percentage 32980 1727096611.24542: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.24543: done checking to see if all hosts have failed 32980 1727096611.24544: getting the remaining hosts for this loop 32980 1727096611.24545: done getting the remaining hosts for this loop 32980 1727096611.24549: getting the next task for host managed_node2 32980 1727096611.24558: done getting next task for host managed_node2 32980 1727096611.24561: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32980 1727096611.24563: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096611.24583: getting variables 32980 1727096611.24584: in VariableManager get_vars() 32980 1727096611.24626: Calling all_inventory to load vars for managed_node2 32980 1727096611.24629: Calling groups_inventory to load vars for managed_node2 32980 1727096611.24631: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.24640: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.24643: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.24645: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.25181: done sending task result for task 0afff68d-5257-457d-ef33-00000000006b 32980 1727096611.25185: WORKER PROCESS EXITING 32980 1727096611.25599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.26904: done with get_vars() 32980 1727096611.26926: done getting variables 32980 1727096611.26982: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:03:31 -0400 (0:00:00.069) 0:00:23.196 ****** 32980 1727096611.27015: entering _queue_task() for managed_node2/fail 32980 1727096611.27261: worker is 1 (out of 1 available) 32980 1727096611.27276: exiting _queue_task() for managed_node2/fail 32980 1727096611.27288: done queuing things up, now waiting for results queue to drain 32980 1727096611.27289: waiting for pending results... 
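[Editorial note] The YUM variant of the package-update check is skipped outright on this host because `ansible_distribution_major_version | int < 8` is False, while its DNF counterpart was gated on the complementary `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`. Note also the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line a few entries above: ansible-core resolves the legacy module name before the conditional is even evaluated. A minimal sketch of that version split is below; the module arguments and check_mode usage are assumptions, and both real tasks are additionally gated on the wireless/team condition evaluated earlier (omitted here for brevity).

---
# Sketch of the DNF/YUM version split implied by the two checks above.
# Package names and check_mode usage are illustrative placeholders.
- name: Check for updates via DNF (EL8+ / Fedora)
  ansible.builtin.dnf:
    name: NetworkManager   # placeholder package name
    state: latest
  check_mode: true         # assumed report-only check
  when: >-
    ansible_distribution == 'Fedora'
    or ansible_distribution_major_version | int > 7

- name: Check for updates via YUM (EL7 and older)
  ansible.builtin.yum:     # on EL8+, ansible-core redirects this action to dnf
    name: NetworkManager   # placeholder package name
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
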
32980 1727096611.27476: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32980 1727096611.27573: in run() - task 0afff68d-5257-457d-ef33-00000000006c 32980 1727096611.27587: variable 'ansible_search_path' from source: unknown 32980 1727096611.27591: variable 'ansible_search_path' from source: unknown 32980 1727096611.27619: calling self._execute() 32980 1727096611.27693: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.27697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.27706: variable 'omit' from source: magic vars 32980 1727096611.27989: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.27998: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.28079: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.28210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.30047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.30080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.30119: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.30167: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.30202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.30296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.30342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.30429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.30433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.30443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.30502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.30516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.30537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.30561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.30574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.30607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.30623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.30639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.30679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.30689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.30807: variable 'network_connections' from source: task vars 32980 1727096611.30812: variable 'interface' from source: play vars 32980 1727096611.30857: variable 'interface' from source: play vars 32980 1727096611.30866: variable 'vlan_interface' from source: play vars 32980 1727096611.30912: variable 'vlan_interface' from source: play vars 32980 1727096611.30960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.31071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.31099: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.31120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.31142: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.31177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.31191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.31208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.31225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 
1727096611.31269: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096611.31416: variable 'network_connections' from source: task vars 32980 1727096611.31419: variable 'interface' from source: play vars 32980 1727096611.31463: variable 'interface' from source: play vars 32980 1727096611.31466: variable 'vlan_interface' from source: play vars 32980 1727096611.31513: variable 'vlan_interface' from source: play vars 32980 1727096611.31532: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096611.31535: when evaluation is False, skipping this task 32980 1727096611.31538: _execute() done 32980 1727096611.31540: dumping result to json 32980 1727096611.31543: done dumping result, returning 32980 1727096611.31550: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-457d-ef33-00000000006c] 32980 1727096611.31561: sending task result for task 0afff68d-5257-457d-ef33-00000000006c 32980 1727096611.31642: done sending task result for task 0afff68d-5257-457d-ef33-00000000006c 32980 1727096611.31644: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096611.31697: no more pending results, returning what we have 32980 1727096611.31700: results queue empty 32980 1727096611.31701: checking for any_errors_fatal 32980 1727096611.31708: done checking for any_errors_fatal 32980 1727096611.31708: checking for max_fail_percentage 32980 1727096611.31710: done checking for max_fail_percentage 32980 1727096611.31711: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.31712: done checking to see if all hosts have failed 32980 1727096611.31713: getting the remaining hosts for this loop 32980 1727096611.31714: done getting the remaining hosts for this loop 32980 1727096611.31718: getting the next task for host managed_node2 32980 1727096611.31726: done getting next task for host managed_node2 32980 1727096611.31730: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32980 1727096611.31732: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096611.31748: getting variables 32980 1727096611.31749: in VariableManager get_vars() 32980 1727096611.31794: Calling all_inventory to load vars for managed_node2 32980 1727096611.31796: Calling groups_inventory to load vars for managed_node2 32980 1727096611.31799: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.31808: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.31810: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.31813: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.32901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.34278: done with get_vars() 32980 1727096611.34297: done getting variables 32980 1727096611.34343: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:03:31 -0400 (0:00:00.073) 0:00:23.270 ****** 32980 1727096611.34376: entering _queue_task() for managed_node2/package 32980 1727096611.34623: worker is 1 (out of 1 available) 32980 1727096611.34637: exiting _queue_task() for managed_node2/package 32980 1727096611.34649: done queuing things up, now waiting for results queue to drain 32980 1727096611.34651: waiting for pending results... 32980 1727096611.34829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 32980 1727096611.34912: in run() - task 0afff68d-5257-457d-ef33-00000000006d 32980 1727096611.34925: variable 'ansible_search_path' from source: unknown 32980 1727096611.34929: variable 'ansible_search_path' from source: unknown 32980 1727096611.34957: calling self._execute() 32980 1727096611.35035: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.35039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.35050: variable 'omit' from source: magic vars 32980 1727096611.35323: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.35333: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.35464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.35656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.35692: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.35717: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.35770: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.35850: variable 'network_packages' from source: role '' defaults 32980 1727096611.35926: variable '__network_provider_setup' from source: role '' defaults 32980 1727096611.35935: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096611.35983: variable 
'__network_service_name_default_nm' from source: role '' defaults 32980 1727096611.35991: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096611.36033: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096611.36283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.38095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.38137: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.38164: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.38192: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.38211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.38273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.38295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.38312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.38337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.38349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.38386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.38402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.38417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.38441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.38451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.38599: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32980 1727096611.38673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.38695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.38711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.38735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.38745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.38811: variable 'ansible_python' from source: facts 32980 1727096611.38830: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32980 1727096611.38889: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096611.38946: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096611.39031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.39047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.39064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.39093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.39103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.39137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.39157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.39174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.39201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.39211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.39309: variable 'network_connections' from source: task vars 32980 1727096611.39313: variable 'interface' from source: play vars 32980 1727096611.39387: variable 'interface' from source: play vars 32980 1727096611.39396: variable 'vlan_interface' from source: play vars 32980 1727096611.39465: variable 'vlan_interface' from source: play vars 32980 1727096611.39517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.39536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.39559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.39585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096611.39621: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.39798: variable 'network_connections' from source: task vars 32980 1727096611.39802: variable 'interface' from source: play vars 32980 1727096611.39870: variable 'interface' from source: play vars 32980 1727096611.39884: variable 'vlan_interface' from source: play vars 32980 1727096611.39947: variable 'vlan_interface' from source: play vars 32980 1727096611.39971: variable '__network_packages_default_wireless' from source: role '' defaults 32980 1727096611.40041: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.40234: variable 'network_connections' from source: task vars 32980 1727096611.40238: variable 'interface' from source: play vars 32980 1727096611.40286: variable 'interface' from source: play vars 32980 1727096611.40292: variable 'vlan_interface' from source: play vars 32980 1727096611.40339: variable 'vlan_interface' from source: play vars 32980 1727096611.40355: variable '__network_packages_default_team' from source: role '' defaults 32980 1727096611.40411: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096611.40604: variable 'network_connections' from source: task vars 32980 1727096611.40607: variable 'interface' from source: play vars 32980 1727096611.40654: variable 'interface' from source: play vars 32980 1727096611.40660: variable 'vlan_interface' from source: play vars 32980 1727096611.40708: variable 'vlan_interface' from source: play vars 32980 1727096611.40745: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096611.40790: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096611.40796: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096611.40836: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096611.40973: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32980 1727096611.41261: variable 'network_connections' from source: task vars 32980 
1727096611.41265: variable 'interface' from source: play vars 32980 1727096611.41313: variable 'interface' from source: play vars 32980 1727096611.41319: variable 'vlan_interface' from source: play vars 32980 1727096611.41360: variable 'vlan_interface' from source: play vars 32980 1727096611.41366: variable 'ansible_distribution' from source: facts 32980 1727096611.41371: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.41379: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.41390: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32980 1727096611.41504: variable 'ansible_distribution' from source: facts 32980 1727096611.41509: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.41511: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.41523: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32980 1727096611.41627: variable 'ansible_distribution' from source: facts 32980 1727096611.41631: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.41633: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.41659: variable 'network_provider' from source: set_fact 32980 1727096611.41670: variable 'ansible_facts' from source: unknown 32980 1727096611.42038: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32980 1727096611.42041: when evaluation is False, skipping this task 32980 1727096611.42043: _execute() done 32980 1727096611.42046: dumping result to json 32980 1727096611.42048: done dumping result, returning 32980 1727096611.42057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-457d-ef33-00000000006d] 32980 1727096611.42059: sending task result for task 0afff68d-5257-457d-ef33-00000000006d 32980 1727096611.42152: done sending task result for task 0afff68d-5257-457d-ef33-00000000006d 32980 1727096611.42155: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32980 1727096611.42215: no more pending results, returning what we have 32980 1727096611.42218: results queue empty 32980 1727096611.42219: checking for any_errors_fatal 32980 1727096611.42227: done checking for any_errors_fatal 32980 1727096611.42228: checking for max_fail_percentage 32980 1727096611.42230: done checking for max_fail_percentage 32980 1727096611.42231: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.42231: done checking to see if all hosts have failed 32980 1727096611.42232: getting the remaining hosts for this loop 32980 1727096611.42234: done getting the remaining hosts for this loop 32980 1727096611.42237: getting the next task for host managed_node2 32980 1727096611.42247: done getting next task for host managed_node2 32980 1727096611.42251: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32980 1727096611.42254: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096611.42278: getting variables 32980 1727096611.42280: in VariableManager get_vars() 32980 1727096611.42319: Calling all_inventory to load vars for managed_node2 32980 1727096611.42321: Calling groups_inventory to load vars for managed_node2 32980 1727096611.42324: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.42333: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.42336: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.42339: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.43644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.45348: done with get_vars() 32980 1727096611.45380: done getting variables 32980 1727096611.45456: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:03:31 -0400 (0:00:00.111) 0:00:23.381 ****** 32980 1727096611.45497: entering _queue_task() for managed_node2/package 32980 1727096611.45996: worker is 1 (out of 1 available) 32980 1727096611.46008: exiting _queue_task() for managed_node2/package 32980 1727096611.46020: done queuing things up, now waiting for results queue to drain 32980 1727096611.46021: waiting for pending results... 
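The skip recorded above shows how the role guards its package installation: the task only runs when at least one entry in network_packages is missing from the package facts gathered earlier, and here every package was already present. A minimal sketch of a task with that shape, reconstructed from the logged task name, the package action plugin, and the logged false_condition rather than from the role's actual source:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # assumed parameter; the list itself comes from role defaults
        state: present                   # assumed
      when: not network_packages is subset(ansible_facts.packages.keys())   # condition copied from the log

Because the subset test evaluated to True (so the negated condition was False), the task was skipped without any work being dispatched to the managed host.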
32980 1727096611.46385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32980 1727096611.46401: in run() - task 0afff68d-5257-457d-ef33-00000000006e 32980 1727096611.46414: variable 'ansible_search_path' from source: unknown 32980 1727096611.46418: variable 'ansible_search_path' from source: unknown 32980 1727096611.46455: calling self._execute() 32980 1727096611.46575: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.46594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.46605: variable 'omit' from source: magic vars 32980 1727096611.47037: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.47049: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.47180: variable 'network_state' from source: role '' defaults 32980 1727096611.47189: Evaluated conditional (network_state != {}): False 32980 1727096611.47192: when evaluation is False, skipping this task 32980 1727096611.47195: _execute() done 32980 1727096611.47198: dumping result to json 32980 1727096611.47200: done dumping result, returning 32980 1727096611.47208: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-457d-ef33-00000000006e] 32980 1727096611.47212: sending task result for task 0afff68d-5257-457d-ef33-00000000006e 32980 1727096611.47314: done sending task result for task 0afff68d-5257-457d-ef33-00000000006e 32980 1727096611.47316: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096611.47396: no more pending results, returning what we have 32980 1727096611.47399: results queue empty 32980 1727096611.47400: checking for any_errors_fatal 32980 1727096611.47406: done checking for any_errors_fatal 32980 1727096611.47407: checking for max_fail_percentage 32980 1727096611.47409: done checking for max_fail_percentage 32980 1727096611.47409: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.47410: done checking to see if all hosts have failed 32980 1727096611.47411: getting the remaining hosts for this loop 32980 1727096611.47413: done getting the remaining hosts for this loop 32980 1727096611.47416: getting the next task for host managed_node2 32980 1727096611.47424: done getting next task for host managed_node2 32980 1727096611.47427: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32980 1727096611.47430: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096611.47446: getting variables 32980 1727096611.47448: in VariableManager get_vars() 32980 1727096611.47494: Calling all_inventory to load vars for managed_node2 32980 1727096611.47496: Calling groups_inventory to load vars for managed_node2 32980 1727096611.47498: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.47507: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.47509: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.47512: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.48276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.49142: done with get_vars() 32980 1727096611.49159: done getting variables 32980 1727096611.49207: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:03:31 -0400 (0:00:00.037) 0:00:23.419 ****** 32980 1727096611.49232: entering _queue_task() for managed_node2/package 32980 1727096611.49471: worker is 1 (out of 1 available) 32980 1727096611.49486: exiting _queue_task() for managed_node2/package 32980 1727096611.49499: done queuing things up, now waiting for results queue to drain 32980 1727096611.49500: waiting for pending results... 
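The same pattern repeats for the nmstate installation task above: the distribution check (ansible_distribution_major_version != '6') passed, but network_state still resolves to the empty role default, so the package task is skipped. A hedged sketch of what such a task could look like; the task name, the package action plugin, and both logged conditions come from the log, while the package list and whether the version check sits on the task itself or on an enclosing block are assumptions:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # assumed package names, implied only by the task name
          - nmstate
        state: present
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}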
32980 1727096611.49682: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32980 1727096611.49770: in run() - task 0afff68d-5257-457d-ef33-00000000006f 32980 1727096611.49782: variable 'ansible_search_path' from source: unknown 32980 1727096611.49786: variable 'ansible_search_path' from source: unknown 32980 1727096611.49812: calling self._execute() 32980 1727096611.49892: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.49896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.49905: variable 'omit' from source: magic vars 32980 1727096611.50180: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.50189: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.50270: variable 'network_state' from source: role '' defaults 32980 1727096611.50285: Evaluated conditional (network_state != {}): False 32980 1727096611.50288: when evaluation is False, skipping this task 32980 1727096611.50291: _execute() done 32980 1727096611.50294: dumping result to json 32980 1727096611.50296: done dumping result, returning 32980 1727096611.50299: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-457d-ef33-00000000006f] 32980 1727096611.50301: sending task result for task 0afff68d-5257-457d-ef33-00000000006f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096611.50435: no more pending results, returning what we have 32980 1727096611.50438: results queue empty 32980 1727096611.50439: checking for any_errors_fatal 32980 1727096611.50448: done checking for any_errors_fatal 32980 1727096611.50448: checking for max_fail_percentage 32980 1727096611.50450: done checking for max_fail_percentage 32980 1727096611.50450: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.50451: done checking to see if all hosts have failed 32980 1727096611.50452: getting the remaining hosts for this loop 32980 1727096611.50453: done getting the remaining hosts for this loop 32980 1727096611.50457: getting the next task for host managed_node2 32980 1727096611.50465: done getting next task for host managed_node2 32980 1727096611.50471: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32980 1727096611.50476: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096611.50494: getting variables 32980 1727096611.50496: in VariableManager get_vars() 32980 1727096611.50528: Calling all_inventory to load vars for managed_node2 32980 1727096611.50531: Calling groups_inventory to load vars for managed_node2 32980 1727096611.50533: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.50541: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.50544: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.50546: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.51081: done sending task result for task 0afff68d-5257-457d-ef33-00000000006f 32980 1727096611.51085: WORKER PROCESS EXITING 32980 1727096611.51424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.52289: done with get_vars() 32980 1727096611.52306: done getting variables 32980 1727096611.52347: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:03:31 -0400 (0:00:00.031) 0:00:23.450 ****** 32980 1727096611.52375: entering _queue_task() for managed_node2/service 32980 1727096611.52611: worker is 1 (out of 1 available) 32980 1727096611.52625: exiting _queue_task() for managed_node2/service 32980 1727096611.52636: done queuing things up, now waiting for results queue to drain 32980 1727096611.52638: waiting for pending results... 
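Both network_state guarded tasks were skipped because the variable resolves from the role defaults to an empty mapping. For context, a caller opts in to that code path by supplying a non-empty network_state; the snippet below is purely illustrative (the interface values are placeholders and are not taken from this run), showing roughly how that might look:

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth0        # placeholder device name
                  type: ethernet
                  state: up

With network_state left at its default of {}, as in this run, the role continues with the network_connections driven provisioning visible elsewhere in the log.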
32980 1727096611.52812: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32980 1727096611.52899: in run() - task 0afff68d-5257-457d-ef33-000000000070 32980 1727096611.52910: variable 'ansible_search_path' from source: unknown 32980 1727096611.52914: variable 'ansible_search_path' from source: unknown 32980 1727096611.52943: calling self._execute() 32980 1727096611.53020: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.53024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.53034: variable 'omit' from source: magic vars 32980 1727096611.53309: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.53316: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.53397: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.53557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.55099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.55295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.55298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.55301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.55304: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.55333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.55376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.55405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.55445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.55461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.55511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.55534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.55642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32980 1727096611.55645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.55647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.55650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.55666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.55694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.55730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.55750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.55915: variable 'network_connections' from source: task vars 32980 1727096611.55929: variable 'interface' from source: play vars 32980 1727096611.56000: variable 'interface' from source: play vars 32980 1727096611.56011: variable 'vlan_interface' from source: play vars 32980 1727096611.56104: variable 'vlan_interface' from source: play vars 32980 1727096611.56141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.56298: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.56338: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.56361: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.56388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.56422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.56437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.56454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.56473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096611.56521: variable '__network_team_connections_defined' from source: role '' 
defaults 32980 1727096611.56678: variable 'network_connections' from source: task vars 32980 1727096611.56683: variable 'interface' from source: play vars 32980 1727096611.56730: variable 'interface' from source: play vars 32980 1727096611.56733: variable 'vlan_interface' from source: play vars 32980 1727096611.56780: variable 'vlan_interface' from source: play vars 32980 1727096611.56798: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32980 1727096611.56801: when evaluation is False, skipping this task 32980 1727096611.56803: _execute() done 32980 1727096611.56806: dumping result to json 32980 1727096611.56808: done dumping result, returning 32980 1727096611.56816: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-457d-ef33-000000000070] 32980 1727096611.56826: sending task result for task 0afff68d-5257-457d-ef33-000000000070 32980 1727096611.56912: done sending task result for task 0afff68d-5257-457d-ef33-000000000070 32980 1727096611.56915: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32980 1727096611.56996: no more pending results, returning what we have 32980 1727096611.56999: results queue empty 32980 1727096611.57000: checking for any_errors_fatal 32980 1727096611.57005: done checking for any_errors_fatal 32980 1727096611.57006: checking for max_fail_percentage 32980 1727096611.57007: done checking for max_fail_percentage 32980 1727096611.57008: checking to see if all hosts have failed and the running result is not ok 32980 1727096611.57009: done checking to see if all hosts have failed 32980 1727096611.57010: getting the remaining hosts for this loop 32980 1727096611.57011: done getting the remaining hosts for this loop 32980 1727096611.57015: getting the next task for host managed_node2 32980 1727096611.57023: done getting next task for host managed_node2 32980 1727096611.57026: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32980 1727096611.57029: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096611.57046: getting variables 32980 1727096611.57051: in VariableManager get_vars() 32980 1727096611.57091: Calling all_inventory to load vars for managed_node2 32980 1727096611.57094: Calling groups_inventory to load vars for managed_node2 32980 1727096611.57095: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096611.57105: Calling all_plugins_play to load vars for managed_node2 32980 1727096611.57107: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096611.57109: Calling groups_plugins_play to load vars for managed_node2 32980 1727096611.57912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096611.59444: done with get_vars() 32980 1727096611.59470: done getting variables 32980 1727096611.59530: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:03:31 -0400 (0:00:00.071) 0:00:23.522 ****** 32980 1727096611.59563: entering _queue_task() for managed_node2/service 32980 1727096611.59911: worker is 1 (out of 1 available) 32980 1727096611.59925: exiting _queue_task() for managed_node2/service 32980 1727096611.59938: done queuing things up, now waiting for results queue to drain 32980 1727096611.59939: waiting for pending results... 32980 1727096611.60386: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32980 1727096611.60390: in run() - task 0afff68d-5257-457d-ef33-000000000071 32980 1727096611.60393: variable 'ansible_search_path' from source: unknown 32980 1727096611.60402: variable 'ansible_search_path' from source: unknown 32980 1727096611.60442: calling self._execute() 32980 1727096611.60550: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.60561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.60579: variable 'omit' from source: magic vars 32980 1727096611.60960: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.60982: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096611.61146: variable 'network_provider' from source: set_fact 32980 1727096611.61161: variable 'network_state' from source: role '' defaults 32980 1727096611.61177: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32980 1727096611.61187: variable 'omit' from source: magic vars 32980 1727096611.61243: variable 'omit' from source: magic vars 32980 1727096611.61281: variable 'network_service_name' from source: role '' defaults 32980 1727096611.61349: variable 'network_service_name' from source: role '' defaults 32980 1727096611.61461: variable '__network_provider_setup' from source: role '' defaults 32980 1727096611.61475: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096611.61574: variable '__network_service_name_default_nm' from source: role '' defaults 32980 1727096611.61577: variable '__network_packages_default_nm' from source: role '' defaults 
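Unlike the preceding tasks, the "Enable and start NetworkManager" task is not skipped: the condition network_provider == "nm" or network_state != {} evaluates to True because network_provider was set to nm earlier via set_fact. A minimal sketch of a service task with that shape; the task name, the service action plugin, the network_service_name variable, and the condition are taken from the log, while the state/enabled parameters are assumptions implied by the task name:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolved from role defaults in the log
        state: started                       # assumed; implied by "start" in the task name
        enabled: true                        # assumed; implied by "enable" in the task name
      when: network_provider == "nm" or network_state != {}

The remainder of the log for this task shows the normal remote execution path: the service action resolves to the systemd module, a temporary directory is created over the existing SSH control socket, AnsiballZ_systemd.py is uploaded with sftp, made executable, and run with /usr/bin/python3.12 on the managed host.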
32980 1727096611.61618: variable '__network_packages_default_nm' from source: role '' defaults 32980 1727096611.61849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096611.63984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096611.64074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096611.64089: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096611.64138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096611.64174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096611.64274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.64305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.64372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.64381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.64397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.64444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.64469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.64496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.64540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.64558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.64842: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32980 1727096611.64928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.64960: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.64991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.65032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.65051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.65150: variable 'ansible_python' from source: facts 32980 1727096611.65190: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32980 1727096611.65373: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096611.65377: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096611.65494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.65522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.65549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.65596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.65618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.65666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096611.65708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096611.65740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.65825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096611.65828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096611.65949: variable 'network_connections' from 
source: task vars 32980 1727096611.65961: variable 'interface' from source: play vars 32980 1727096611.66039: variable 'interface' from source: play vars 32980 1727096611.66059: variable 'vlan_interface' from source: play vars 32980 1727096611.66134: variable 'vlan_interface' from source: play vars 32980 1727096611.66263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096611.66433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096611.66493: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096611.66537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096611.66772: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096611.66775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096611.66778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096611.66780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096611.66782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096611.66813: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.67090: variable 'network_connections' from source: task vars 32980 1727096611.67101: variable 'interface' from source: play vars 32980 1727096611.67181: variable 'interface' from source: play vars 32980 1727096611.67197: variable 'vlan_interface' from source: play vars 32980 1727096611.67276: variable 'vlan_interface' from source: play vars 32980 1727096611.67312: variable '__network_packages_default_wireless' from source: role '' defaults 32980 1727096611.67399: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096611.67694: variable 'network_connections' from source: task vars 32980 1727096611.67704: variable 'interface' from source: play vars 32980 1727096611.67778: variable 'interface' from source: play vars 32980 1727096611.67791: variable 'vlan_interface' from source: play vars 32980 1727096611.67859: variable 'vlan_interface' from source: play vars 32980 1727096611.67891: variable '__network_packages_default_team' from source: role '' defaults 32980 1727096611.67973: variable '__network_team_connections_defined' from source: role '' defaults 32980 1727096611.68270: variable 'network_connections' from source: task vars 32980 1727096611.68281: variable 'interface' from source: play vars 32980 1727096611.68353: variable 'interface' from source: play vars 32980 1727096611.68365: variable 'vlan_interface' from source: play vars 32980 1727096611.68524: variable 'vlan_interface' from source: play vars 32980 1727096611.68527: variable '__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096611.68563: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 32980 1727096611.68578: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096611.68642: variable '__network_packages_default_initscripts' from source: role '' defaults 32980 1727096611.68860: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32980 1727096611.69348: variable 'network_connections' from source: task vars 32980 1727096611.69357: variable 'interface' from source: play vars 32980 1727096611.69422: variable 'interface' from source: play vars 32980 1727096611.69434: variable 'vlan_interface' from source: play vars 32980 1727096611.69500: variable 'vlan_interface' from source: play vars 32980 1727096611.69512: variable 'ansible_distribution' from source: facts 32980 1727096611.69520: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.69529: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.69546: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32980 1727096611.69728: variable 'ansible_distribution' from source: facts 32980 1727096611.69736: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.69745: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.69761: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32980 1727096611.70039: variable 'ansible_distribution' from source: facts 32980 1727096611.70042: variable '__network_rh_distros' from source: role '' defaults 32980 1727096611.70044: variable 'ansible_distribution_major_version' from source: facts 32980 1727096611.70046: variable 'network_provider' from source: set_fact 32980 1727096611.70048: variable 'omit' from source: magic vars 32980 1727096611.70050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096611.70078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096611.70101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096611.70121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096611.70135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096611.70173: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096611.70181: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.70188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.70295: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096611.70305: Set connection var ansible_timeout to 10 32980 1727096611.70312: Set connection var ansible_shell_type to sh 32980 1727096611.70317: Set connection var ansible_connection to ssh 32980 1727096611.70327: Set connection var ansible_shell_executable to /bin/sh 32980 1727096611.70335: Set connection var ansible_pipelining to False 32980 1727096611.70364: variable 'ansible_shell_executable' from source: unknown 32980 1727096611.70373: variable 'ansible_connection' from source: unknown 32980 1727096611.70381: variable 'ansible_module_compression' from source: unknown 32980 1727096611.70387: 
variable 'ansible_shell_type' from source: unknown 32980 1727096611.70393: variable 'ansible_shell_executable' from source: unknown 32980 1727096611.70404: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096611.70411: variable 'ansible_pipelining' from source: unknown 32980 1727096611.70417: variable 'ansible_timeout' from source: unknown 32980 1727096611.70424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096611.70532: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096611.70581: variable 'omit' from source: magic vars 32980 1727096611.70585: starting attempt loop 32980 1727096611.70587: running the handler 32980 1727096611.70643: variable 'ansible_facts' from source: unknown 32980 1727096611.71436: _low_level_execute_command(): starting 32980 1727096611.71453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096611.72218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096611.72233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096611.72342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096611.72381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096611.72452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096611.74152: stdout chunk (state=3): >>>/root <<< 32980 1727096611.74254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096611.74287: stderr chunk (state=3): >>><<< 32980 1727096611.74292: stdout chunk (state=3): >>><<< 32980 1727096611.74309: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096611.74320: _low_level_execute_command(): starting 32980 1727096611.74326: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699 `" && echo ansible-tmp-1727096611.7431035-34074-94819241004699="` echo /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699 `" ) && sleep 0' 32980 1727096611.74749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096611.74756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.74784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096611.74787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.74837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096611.74843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096611.74846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096611.74880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096611.76839: stdout chunk (state=3): >>>ansible-tmp-1727096611.7431035-34074-94819241004699=/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699 <<< 32980 1727096611.76944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096611.76975: stderr chunk (state=3): >>><<< 32980 1727096611.76979: stdout chunk (state=3): >>><<< 32980 1727096611.76993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096611.7431035-34074-94819241004699=/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096611.77020: variable 'ansible_module_compression' from source: unknown 32980 1727096611.77062: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 32980 1727096611.77116: variable 'ansible_facts' from source: unknown 32980 1727096611.77250: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py 32980 1727096611.77355: Sending initial data 32980 1727096611.77358: Sent initial data (155 bytes) 32980 1727096611.77813: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.77858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096611.77865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096611.77897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096611.79502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096611.79529: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096611.79562: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmppu0m6la3 /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py <<< 32980 1727096611.79564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py" <<< 32980 1727096611.79597: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmppu0m6la3" to remote "/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py" <<< 32980 1727096611.79601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py" <<< 32980 1727096611.80593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096611.80631: stderr chunk (state=3): >>><<< 32980 1727096611.80634: stdout chunk (state=3): >>><<< 32980 1727096611.80671: done transferring module to remote 32980 1727096611.80681: _low_level_execute_command(): starting 32980 1727096611.80686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/ /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py && sleep 0' 32980 1727096611.81138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.81141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096611.81144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.81204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096611.81220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096611.81278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096611.83065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096611.83089: stderr chunk (state=3): >>><<< 32980 1727096611.83092: stdout chunk (state=3): >>><<< 32980 1727096611.83104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096611.83107: _low_level_execute_command(): starting 32980 1727096611.83112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/AnsiballZ_systemd.py && sleep 0' 32980 1727096611.83533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096611.83537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096611.83539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096611.83541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096611.83543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096611.83601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096611.83607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096611.83617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096611.83641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.13399: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": 
"org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4743168", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311243264", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1986244000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 32980 1727096612.13427: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "syst<<< 32980 1727096612.13436: stdout chunk (state=3): >>>emd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32980 1727096612.15479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096612.15483: stdout chunk (state=3): >>><<< 32980 1727096612.15486: stderr chunk (state=3): >>><<< 32980 1727096612.15519: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4743168", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311243264", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1986244000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096612.15638: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096612.15653: _low_level_execute_command(): starting 32980 1727096612.15657: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096611.7431035-34074-94819241004699/ > /dev/null 2>&1 && sleep 0' 32980 1727096612.16101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.16106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.16109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.16112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.16164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' 
<<< 32980 1727096612.16174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096612.16177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.16208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.18153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096612.18157: stdout chunk (state=3): >>><<< 32980 1727096612.18159: stderr chunk (state=3): >>><<< 32980 1727096612.18374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096612.18378: handler run complete 32980 1727096612.18380: attempt loop complete, returning result 32980 1727096612.18382: _execute() done 32980 1727096612.18384: dumping result to json 32980 1727096612.18386: done dumping result, returning 32980 1727096612.18388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-457d-ef33-000000000071] 32980 1727096612.18390: sending task result for task 0afff68d-5257-457d-ef33-000000000071 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 1727096612.18736: no more pending results, returning what we have 32980 1727096612.18740: results queue empty 32980 1727096612.18740: checking for any_errors_fatal 32980 1727096612.18747: done checking for any_errors_fatal 32980 1727096612.18747: checking for max_fail_percentage 32980 1727096612.18749: done checking for max_fail_percentage 32980 1727096612.18750: checking to see if all hosts have failed and the running result is not ok 32980 1727096612.18750: done checking to see if all hosts have failed 32980 1727096612.18751: getting the remaining hosts for this loop 32980 1727096612.18752: done getting the remaining hosts for this loop 32980 1727096612.18756: getting the next task for host managed_node2 32980 1727096612.18771: done getting next task for host managed_node2 32980 1727096612.18775: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32980 1727096612.18777: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096612.18788: getting variables 32980 1727096612.18790: in VariableManager get_vars() 32980 1727096612.18882: Calling all_inventory to load vars for managed_node2 32980 1727096612.18885: Calling groups_inventory to load vars for managed_node2 32980 1727096612.18888: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096612.18894: done sending task result for task 0afff68d-5257-457d-ef33-000000000071 32980 1727096612.18896: WORKER PROCESS EXITING 32980 1727096612.18905: Calling all_plugins_play to load vars for managed_node2 32980 1727096612.18907: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096612.18910: Calling groups_plugins_play to load vars for managed_node2 32980 1727096612.19707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096612.20600: done with get_vars() 32980 1727096612.20617: done getting variables 32980 1727096612.20684: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:03:32 -0400 (0:00:00.611) 0:00:24.133 ****** 32980 1727096612.20716: entering _queue_task() for managed_node2/service 32980 1727096612.21028: worker is 1 (out of 1 available) 32980 1727096612.21044: exiting _queue_task() for managed_node2/service 32980 1727096612.21057: done queuing things up, now waiting for results queue to drain 32980 1727096612.21058: waiting for pending results... 
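[Editor's note: for readers following the trace, here is a minimal sketch, an assumption rather than the role's actual source, of the kind of task that produces the ansible.legacy.systemd call recorded above. The parameter values are copied from the module_args shown in the log, and no_log: true is what yields the censored "ok" result:]

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager      # values taken from the module_args in the log above
        state: started
        enabled: true
      no_log: true                # matches "the output has been hidden ... 'no_log: true'"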
32980 1727096612.21489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32980 1727096612.21498: in run() - task 0afff68d-5257-457d-ef33-000000000072 32980 1727096612.21518: variable 'ansible_search_path' from source: unknown 32980 1727096612.21526: variable 'ansible_search_path' from source: unknown 32980 1727096612.21571: calling self._execute() 32980 1727096612.21647: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.21651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.21661: variable 'omit' from source: magic vars 32980 1727096612.22042: variable 'ansible_distribution_major_version' from source: facts 32980 1727096612.22274: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096612.22278: variable 'network_provider' from source: set_fact 32980 1727096612.22281: Evaluated conditional (network_provider == "nm"): True 32980 1727096612.22308: variable '__network_wpa_supplicant_required' from source: role '' defaults 32980 1727096612.22403: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32980 1727096612.22575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096612.24387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096612.24430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096612.24457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096612.24489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096612.24509: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096612.24569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096612.24595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096612.24612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096612.24637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096612.24647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096612.24687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096612.24705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32980 1727096612.24721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096612.24745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096612.24755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096612.24787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096612.24806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096612.24821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096612.24869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096612.25040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096612.25044: variable 'network_connections' from source: task vars 32980 1727096612.25046: variable 'interface' from source: play vars 32980 1727096612.25273: variable 'interface' from source: play vars 32980 1727096612.25277: variable 'vlan_interface' from source: play vars 32980 1727096612.25279: variable 'vlan_interface' from source: play vars 32980 1727096612.25281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096612.29913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096612.29944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096612.29966: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096612.29994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096612.30029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32980 1727096612.30045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32980 1727096612.30061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096612.30083: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32980 1727096612.30119: variable '__network_wireless_connections_defined' from source: role '' defaults 32980 1727096612.30288: variable 'network_connections' from source: task vars 32980 1727096612.30292: variable 'interface' from source: play vars 32980 1727096612.30342: variable 'interface' from source: play vars 32980 1727096612.30348: variable 'vlan_interface' from source: play vars 32980 1727096612.30395: variable 'vlan_interface' from source: play vars 32980 1727096612.30421: Evaluated conditional (__network_wpa_supplicant_required): False 32980 1727096612.30425: when evaluation is False, skipping this task 32980 1727096612.30435: _execute() done 32980 1727096612.30437: dumping result to json 32980 1727096612.30440: done dumping result, returning 32980 1727096612.30442: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-457d-ef33-000000000072] 32980 1727096612.30444: sending task result for task 0afff68d-5257-457d-ef33-000000000072 32980 1727096612.30520: done sending task result for task 0afff68d-5257-457d-ef33-000000000072 32980 1727096612.30525: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32980 1727096612.30592: no more pending results, returning what we have 32980 1727096612.30595: results queue empty 32980 1727096612.30596: checking for any_errors_fatal 32980 1727096612.30610: done checking for any_errors_fatal 32980 1727096612.30611: checking for max_fail_percentage 32980 1727096612.30612: done checking for max_fail_percentage 32980 1727096612.30613: checking to see if all hosts have failed and the running result is not ok 32980 1727096612.30614: done checking to see if all hosts have failed 32980 1727096612.30615: getting the remaining hosts for this loop 32980 1727096612.30616: done getting the remaining hosts for this loop 32980 1727096612.30619: getting the next task for host managed_node2 32980 1727096612.30626: done getting next task for host managed_node2 32980 1727096612.30630: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32980 1727096612.30632: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096612.30650: getting variables 32980 1727096612.30652: in VariableManager get_vars() 32980 1727096612.30692: Calling all_inventory to load vars for managed_node2 32980 1727096612.30694: Calling groups_inventory to load vars for managed_node2 32980 1727096612.30696: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096612.30705: Calling all_plugins_play to load vars for managed_node2 32980 1727096612.30708: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096612.30710: Calling groups_plugins_play to load vars for managed_node2 32980 1727096612.34971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096612.35821: done with get_vars() 32980 1727096612.35838: done getting variables 32980 1727096612.35876: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:03:32 -0400 (0:00:00.151) 0:00:24.285 ****** 32980 1727096612.35898: entering _queue_task() for managed_node2/service 32980 1727096612.36163: worker is 1 (out of 1 available) 32980 1727096612.36180: exiting _queue_task() for managed_node2/service 32980 1727096612.36191: done queuing things up, now waiting for results queue to drain 32980 1727096612.36192: waiting for pending results... 32980 1727096612.36382: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 32980 1727096612.36474: in run() - task 0afff68d-5257-457d-ef33-000000000073 32980 1727096612.36486: variable 'ansible_search_path' from source: unknown 32980 1727096612.36490: variable 'ansible_search_path' from source: unknown 32980 1727096612.36519: calling self._execute() 32980 1727096612.36596: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.36600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.36609: variable 'omit' from source: magic vars 32980 1727096612.36892: variable 'ansible_distribution_major_version' from source: facts 32980 1727096612.36902: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096612.36987: variable 'network_provider' from source: set_fact 32980 1727096612.36993: Evaluated conditional (network_provider == "initscripts"): False 32980 1727096612.36996: when evaluation is False, skipping this task 32980 1727096612.36999: _execute() done 32980 1727096612.37002: dumping result to json 32980 1727096612.37005: done dumping result, returning 32980 1727096612.37011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-457d-ef33-000000000073] 32980 1727096612.37015: sending task result for task 0afff68d-5257-457d-ef33-000000000073 32980 1727096612.37111: done sending task result for task 0afff68d-5257-457d-ef33-000000000073 32980 1727096612.37114: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32980 
1727096612.37157: no more pending results, returning what we have 32980 1727096612.37160: results queue empty 32980 1727096612.37161: checking for any_errors_fatal 32980 1727096612.37172: done checking for any_errors_fatal 32980 1727096612.37173: checking for max_fail_percentage 32980 1727096612.37175: done checking for max_fail_percentage 32980 1727096612.37175: checking to see if all hosts have failed and the running result is not ok 32980 1727096612.37176: done checking to see if all hosts have failed 32980 1727096612.37177: getting the remaining hosts for this loop 32980 1727096612.37179: done getting the remaining hosts for this loop 32980 1727096612.37182: getting the next task for host managed_node2 32980 1727096612.37190: done getting next task for host managed_node2 32980 1727096612.37193: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32980 1727096612.37196: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096612.37216: getting variables 32980 1727096612.37217: in VariableManager get_vars() 32980 1727096612.37254: Calling all_inventory to load vars for managed_node2 32980 1727096612.37257: Calling groups_inventory to load vars for managed_node2 32980 1727096612.37259: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096612.37269: Calling all_plugins_play to load vars for managed_node2 32980 1727096612.37272: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096612.37275: Calling groups_plugins_play to load vars for managed_node2 32980 1727096612.38106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096612.38967: done with get_vars() 32980 1727096612.38986: done getting variables 32980 1727096612.39030: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:03:32 -0400 (0:00:00.031) 0:00:24.317 ****** 32980 1727096612.39055: entering _queue_task() for managed_node2/copy 32980 1727096612.39294: worker is 1 (out of 1 available) 32980 1727096612.39309: exiting _queue_task() for managed_node2/copy 32980 1727096612.39322: done queuing things up, now waiting for results queue to drain 32980 1727096612.39331: waiting for pending results... 
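[Editor's note: the skipping: results above come from when: conditionals that evaluate to False because network_provider is "nm" on this host. A simplified sketch of that pattern, with assumed module and service names; the role's real tasks at roles/network/tasks/main.yml:133 and :142 may differ:]

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required   # evaluated False in the log, so the task is skipped

    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"   # False when the provider is NetworkManager ("nm")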
32980 1727096612.39666: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32980 1727096612.39717: in run() - task 0afff68d-5257-457d-ef33-000000000074 32980 1727096612.39736: variable 'ansible_search_path' from source: unknown 32980 1727096612.39743: variable 'ansible_search_path' from source: unknown 32980 1727096612.39787: calling self._execute() 32980 1727096612.39888: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.39899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.39913: variable 'omit' from source: magic vars 32980 1727096612.40305: variable 'ansible_distribution_major_version' from source: facts 32980 1727096612.40328: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096612.40451: variable 'network_provider' from source: set_fact 32980 1727096612.40462: Evaluated conditional (network_provider == "initscripts"): False 32980 1727096612.40471: when evaluation is False, skipping this task 32980 1727096612.40479: _execute() done 32980 1727096612.40485: dumping result to json 32980 1727096612.40492: done dumping result, returning 32980 1727096612.40504: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-457d-ef33-000000000074] 32980 1727096612.40513: sending task result for task 0afff68d-5257-457d-ef33-000000000074 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32980 1727096612.40713: no more pending results, returning what we have 32980 1727096612.40717: results queue empty 32980 1727096612.40718: checking for any_errors_fatal 32980 1727096612.40725: done checking for any_errors_fatal 32980 1727096612.40726: checking for max_fail_percentage 32980 1727096612.40728: done checking for max_fail_percentage 32980 1727096612.40729: checking to see if all hosts have failed and the running result is not ok 32980 1727096612.40730: done checking to see if all hosts have failed 32980 1727096612.40731: getting the remaining hosts for this loop 32980 1727096612.40733: done getting the remaining hosts for this loop 32980 1727096612.40736: getting the next task for host managed_node2 32980 1727096612.40745: done getting next task for host managed_node2 32980 1727096612.40749: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32980 1727096612.40752: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096612.40773: getting variables 32980 1727096612.40775: in VariableManager get_vars() 32980 1727096612.40816: Calling all_inventory to load vars for managed_node2 32980 1727096612.40819: Calling groups_inventory to load vars for managed_node2 32980 1727096612.40821: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096612.40834: Calling all_plugins_play to load vars for managed_node2 32980 1727096612.40838: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096612.40841: Calling groups_plugins_play to load vars for managed_node2 32980 1727096612.41580: done sending task result for task 0afff68d-5257-457d-ef33-000000000074 32980 1727096612.41583: WORKER PROCESS EXITING 32980 1727096612.42170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096612.43027: done with get_vars() 32980 1727096612.43042: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:03:32 -0400 (0:00:00.040) 0:00:24.357 ****** 32980 1727096612.43104: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 32980 1727096612.43332: worker is 1 (out of 1 available) 32980 1727096612.43345: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 32980 1727096612.43357: done queuing things up, now waiting for results queue to drain 32980 1727096612.43359: waiting for pending results... 32980 1727096612.43574: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32980 1727096612.43714: in run() - task 0afff68d-5257-457d-ef33-000000000075 32980 1727096612.43733: variable 'ansible_search_path' from source: unknown 32980 1727096612.43739: variable 'ansible_search_path' from source: unknown 32980 1727096612.43780: calling self._execute() 32980 1727096612.43885: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.43896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.43922: variable 'omit' from source: magic vars 32980 1727096612.44312: variable 'ansible_distribution_major_version' from source: facts 32980 1727096612.44328: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096612.44340: variable 'omit' from source: magic vars 32980 1727096612.44406: variable 'omit' from source: magic vars 32980 1727096612.44577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096612.46650: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096612.46725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096612.46773: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096612.46812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096612.46852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096612.46961: variable 'network_provider' from source: set_fact 32980 1727096612.47056: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096612.47093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096612.47121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096612.47166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096612.47192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096612.47283: variable 'omit' from source: magic vars 32980 1727096612.47381: variable 'omit' from source: magic vars 32980 1727096612.47489: variable 'network_connections' from source: task vars 32980 1727096612.47574: variable 'interface' from source: play vars 32980 1727096612.47577: variable 'interface' from source: play vars 32980 1727096612.47592: variable 'vlan_interface' from source: play vars 32980 1727096612.47657: variable 'vlan_interface' from source: play vars 32980 1727096612.47815: variable 'omit' from source: magic vars 32980 1727096612.47834: variable '__lsr_ansible_managed' from source: task vars 32980 1727096612.47899: variable '__lsr_ansible_managed' from source: task vars 32980 1727096612.48189: Loaded config def from plugin (lookup/template) 32980 1727096612.48202: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32980 1727096612.48235: File lookup term: get_ansible_managed.j2 32980 1727096612.48243: variable 'ansible_search_path' from source: unknown 32980 1727096612.48474: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32980 1727096612.48479: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32980 1727096612.48481: variable 'ansible_search_path' from source: unknown 32980 1727096612.55949: variable 'ansible_managed' from source: 
unknown 32980 1727096612.56089: variable 'omit' from source: magic vars 32980 1727096612.56124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096612.56160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096612.56186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096612.56209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096612.56224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096612.56259: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096612.56275: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.56285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.56389: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096612.56392: Set connection var ansible_timeout to 10 32980 1727096612.56395: Set connection var ansible_shell_type to sh 32980 1727096612.56397: Set connection var ansible_connection to ssh 32980 1727096612.56402: Set connection var ansible_shell_executable to /bin/sh 32980 1727096612.56407: Set connection var ansible_pipelining to False 32980 1727096612.56425: variable 'ansible_shell_executable' from source: unknown 32980 1727096612.56428: variable 'ansible_connection' from source: unknown 32980 1727096612.56430: variable 'ansible_module_compression' from source: unknown 32980 1727096612.56433: variable 'ansible_shell_type' from source: unknown 32980 1727096612.56435: variable 'ansible_shell_executable' from source: unknown 32980 1727096612.56437: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096612.56439: variable 'ansible_pipelining' from source: unknown 32980 1727096612.56441: variable 'ansible_timeout' from source: unknown 32980 1727096612.56446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096612.56549: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096612.56561: variable 'omit' from source: magic vars 32980 1727096612.56564: starting attempt loop 32980 1727096612.56569: running the handler 32980 1727096612.56580: _low_level_execute_command(): starting 32980 1727096612.56588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096612.57075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.57079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096612.57083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096612.57086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.57131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096612.57135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096612.57137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.57184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.58862: stdout chunk (state=3): >>>/root <<< 32980 1727096612.58966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096612.58998: stderr chunk (state=3): >>><<< 32980 1727096612.59001: stdout chunk (state=3): >>><<< 32980 1727096612.59019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096612.59029: _low_level_execute_command(): starting 32980 1727096612.59035: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462 `" && echo ansible-tmp-1727096612.5901918-34103-24998724329462="` echo /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462 `" ) && sleep 0' 32980 1727096612.59428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.59460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096612.59463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096612.59465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.59470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096612.59476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096612.59478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.59520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096612.59523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.59561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.61460: stdout chunk (state=3): >>>ansible-tmp-1727096612.5901918-34103-24998724329462=/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462 <<< 32980 1727096612.61627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096612.61630: stdout chunk (state=3): >>><<< 32980 1727096612.61633: stderr chunk (state=3): >>><<< 32980 1727096612.61656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096612.5901918-34103-24998724329462=/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096612.61703: variable 'ansible_module_compression' from source: unknown 32980 1727096612.61747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 32980 1727096612.61776: variable 'ansible_facts' from source: unknown 32980 1727096612.61844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py 32980 1727096612.61945: Sending initial data 32980 1727096612.61948: Sent initial data (167 bytes) 32980 1727096612.62341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096612.62372: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096612.62381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096612.62384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.62386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.62388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.62441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096612.62446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096612.62449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.62485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.64055: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096612.64114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096612.64164: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp76i9pkf7 /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py <<< 32980 1727096612.64172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py" <<< 32980 1727096612.64182: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 32980 1727096612.64195: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp76i9pkf7" to remote "/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py" <<< 32980 1727096612.65169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096612.65192: stderr chunk (state=3): >>><<< 32980 1727096612.65197: stdout chunk (state=3): >>><<< 32980 1727096612.65255: done transferring module to remote 32980 1727096612.65258: _low_level_execute_command(): starting 32980 1727096612.65260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/ /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py && sleep 0' 32980 1727096612.65740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.65777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096612.65782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.65820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096612.65823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.65864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096612.67772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096612.67778: stderr chunk (state=3): >>><<< 32980 1727096612.67780: stdout chunk (state=3): >>><<< 32980 1727096612.67783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096612.67785: _low_level_execute_command(): starting 32980 1727096612.67787: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/AnsiballZ_network_connections.py && sleep 0' 32980 1727096612.68278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096612.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096612.68298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.68304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096612.68350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096612.68393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096612.68396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096612.68428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.05624: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
lsr101/e9c06f46-3096-44b6-a493-93c164acfa65: error=unknown <<< 32980 1727096613.07090: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 32980 1727096613.07096: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 32980 1727096613.07112: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/fec20eaf-3c2b-4545-97bb-baae47791113: error=unknown <<< 32980 1727096613.07296: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32980 1727096613.09297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096613.09301: stdout chunk (state=3): >>><<< 32980 1727096613.09304: stderr chunk (state=3): >>><<< 32980 1727096613.09458: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/e9c06f46-3096-44b6-a493-93c164acfa65: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ygo3_4gw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/fec20eaf-3c2b-4545-97bb-baae47791113: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096613.09462: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096613.09464: _low_level_execute_command(): starting 32980 1727096613.09468: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096612.5901918-34103-24998724329462/ > /dev/null 2>&1 && sleep 0' 32980 1727096613.10037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.10049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.10064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096613.10150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.10191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096613.10209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.10229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.10301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.12258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.12314: stdout chunk (state=3): >>><<< 32980 1727096613.12317: stderr chunk (state=3): >>><<< 32980 1727096613.12357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096613.12396: handler run complete 32980 1727096613.12449: attempt loop complete, returning result 32980 1727096613.12452: _execute() done 32980 1727096613.12455: dumping result to json 32980 1727096613.12457: done dumping result, returning 32980 1727096613.12459: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-457d-ef33-000000000075] 32980 1727096613.12461: sending task result for task 0afff68d-5257-457d-ef33-000000000075 32980 1727096613.12551: done sending task result for task 0afff68d-5257-457d-ef33-000000000075 32980 1727096613.12553: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 32980 1727096613.12788: no more pending results, returning what we have 32980 1727096613.12791: results queue empty 32980 1727096613.12792: checking for any_errors_fatal 32980 1727096613.12800: done checking for any_errors_fatal 32980 1727096613.12801: checking for max_fail_percentage 32980 1727096613.12802: done checking for max_fail_percentage 32980 1727096613.12803: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.12804: done checking to see if all hosts have failed 32980 1727096613.12805: getting the remaining hosts for this loop 32980 1727096613.12806: done getting the remaining hosts for this loop 32980 1727096613.12810: getting the next task for host managed_node2 32980 1727096613.12818: done getting next task for host managed_node2 32980 1727096613.12821: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32980 1727096613.12824: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 32980 1727096613.12835: getting variables 32980 1727096613.12837: in VariableManager get_vars() 32980 1727096613.12946: Calling all_inventory to load vars for managed_node2 32980 1727096613.12949: Calling groups_inventory to load vars for managed_node2 32980 1727096613.12951: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.12961: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.12964: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.12967: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.14050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.16151: done with get_vars() 32980 1727096613.16186: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:03:33 -0400 (0:00:00.731) 0:00:25.090 ****** 32980 1727096613.16352: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 32980 1727096613.16889: worker is 1 (out of 1 available) 32980 1727096613.16903: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 32980 1727096613.16914: done queuing things up, now waiting for results queue to drain 32980 1727096613.16915: waiting for pending results... 32980 1727096613.17810: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 32980 1727096613.17814: in run() - task 0afff68d-5257-457d-ef33-000000000076 32980 1727096613.17817: variable 'ansible_search_path' from source: unknown 32980 1727096613.17819: variable 'ansible_search_path' from source: unknown 32980 1727096613.17884: calling self._execute() 32980 1727096613.18006: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.18015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.18024: variable 'omit' from source: magic vars 32980 1727096613.18325: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.18336: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.18418: variable 'network_state' from source: role '' defaults 32980 1727096613.18431: Evaluated conditional (network_state != {}): False 32980 1727096613.18434: when evaluation is False, skipping this task 32980 1727096613.18439: _execute() done 32980 1727096613.18441: dumping result to json 32980 1727096613.18444: done dumping result, returning 32980 1727096613.18447: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-457d-ef33-000000000076] 32980 1727096613.18449: sending task result for task 0afff68d-5257-457d-ef33-000000000076 32980 1727096613.18537: done sending task result for task 0afff68d-5257-457d-ef33-000000000076 32980 1727096613.18540: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32980 1727096613.18591: no more pending results, returning what we have 32980 1727096613.18594: results queue empty 32980 1727096613.18595: checking for any_errors_fatal 32980 1727096613.18603: done checking for any_errors_fatal 32980 1727096613.18604: checking for 
max_fail_percentage 32980 1727096613.18606: done checking for max_fail_percentage 32980 1727096613.18606: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.18607: done checking to see if all hosts have failed 32980 1727096613.18608: getting the remaining hosts for this loop 32980 1727096613.18609: done getting the remaining hosts for this loop 32980 1727096613.18613: getting the next task for host managed_node2 32980 1727096613.18621: done getting next task for host managed_node2 32980 1727096613.18624: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32980 1727096613.18628: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096613.18646: getting variables 32980 1727096613.18648: in VariableManager get_vars() 32980 1727096613.18691: Calling all_inventory to load vars for managed_node2 32980 1727096613.18694: Calling groups_inventory to load vars for managed_node2 32980 1727096613.18696: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.18706: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.18708: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.18711: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.19512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.21121: done with get_vars() 32980 1727096613.21147: done getting variables 32980 1727096613.21219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:03:33 -0400 (0:00:00.048) 0:00:25.139 ****** 32980 1727096613.21253: entering _queue_task() for managed_node2/debug 32980 1727096613.21757: worker is 1 (out of 1 available) 32980 1727096613.21895: exiting _queue_task() for managed_node2/debug 32980 1727096613.21907: done queuing things up, now waiting for results queue to drain 32980 1727096613.21909: waiting for pending results... 
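The module arguments logged for the "Configure networking connection profiles" step above (provider nm, connections lsr101 and lsr101.90 set to state down and persistent_state absent) correspond to play variables of roughly the shape sketched below. This is a reconstruction for illustration, assuming the profiles are passed to the role via network_connections; the play name is made up, and in this run the nm provider came from an earlier set_fact rather than an explicit variable.

- name: Tear down the test profiles via the network role   # hypothetical play name
  hosts: managed_node2
  vars:
    network_connections:
      - name: lsr101
        persistent_state: absent
        state: down
      - name: lsr101.90
        persistent_state: absent
        state: down
  roles:
    - fedora.linux_system_roles.network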
32980 1727096613.22286: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32980 1727096613.22292: in run() - task 0afff68d-5257-457d-ef33-000000000077 32980 1727096613.22295: variable 'ansible_search_path' from source: unknown 32980 1727096613.22298: variable 'ansible_search_path' from source: unknown 32980 1727096613.22304: calling self._execute() 32980 1727096613.22408: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.22411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.22424: variable 'omit' from source: magic vars 32980 1727096613.22950: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.22962: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.22971: variable 'omit' from source: magic vars 32980 1727096613.23041: variable 'omit' from source: magic vars 32980 1727096613.23084: variable 'omit' from source: magic vars 32980 1727096613.23135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096613.23170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096613.23275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096613.23278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.23281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.23284: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096613.23286: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.23288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.23376: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096613.23380: Set connection var ansible_timeout to 10 32980 1727096613.23382: Set connection var ansible_shell_type to sh 32980 1727096613.23385: Set connection var ansible_connection to ssh 32980 1727096613.23391: Set connection var ansible_shell_executable to /bin/sh 32980 1727096613.23396: Set connection var ansible_pipelining to False 32980 1727096613.23429: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.23432: variable 'ansible_connection' from source: unknown 32980 1727096613.23435: variable 'ansible_module_compression' from source: unknown 32980 1727096613.23437: variable 'ansible_shell_type' from source: unknown 32980 1727096613.23440: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.23442: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.23444: variable 'ansible_pipelining' from source: unknown 32980 1727096613.23446: variable 'ansible_timeout' from source: unknown 32980 1727096613.23450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.23599: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 
1727096613.23772: variable 'omit' from source: magic vars 32980 1727096613.23778: starting attempt loop 32980 1727096613.23782: running the handler 32980 1727096613.23784: variable '__network_connections_result' from source: set_fact 32980 1727096613.23810: handler run complete 32980 1727096613.23827: attempt loop complete, returning result 32980 1727096613.23830: _execute() done 32980 1727096613.23833: dumping result to json 32980 1727096613.23840: done dumping result, returning 32980 1727096613.23860: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-457d-ef33-000000000077] 32980 1727096613.23865: sending task result for task 0afff68d-5257-457d-ef33-000000000077 32980 1727096613.23954: done sending task result for task 0afff68d-5257-457d-ef33-000000000077 32980 1727096613.23957: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 32980 1727096613.24024: no more pending results, returning what we have 32980 1727096613.24028: results queue empty 32980 1727096613.24029: checking for any_errors_fatal 32980 1727096613.24036: done checking for any_errors_fatal 32980 1727096613.24037: checking for max_fail_percentage 32980 1727096613.24038: done checking for max_fail_percentage 32980 1727096613.24039: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.24040: done checking to see if all hosts have failed 32980 1727096613.24041: getting the remaining hosts for this loop 32980 1727096613.24042: done getting the remaining hosts for this loop 32980 1727096613.24046: getting the next task for host managed_node2 32980 1727096613.24055: done getting next task for host managed_node2 32980 1727096613.24059: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32980 1727096613.24062: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096613.24077: getting variables 32980 1727096613.24079: in VariableManager get_vars() 32980 1727096613.24119: Calling all_inventory to load vars for managed_node2 32980 1727096613.24122: Calling groups_inventory to load vars for managed_node2 32980 1727096613.24124: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.24134: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.24137: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.24140: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.26208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.27916: done with get_vars() 32980 1727096613.27937: done getting variables 32980 1727096613.28009: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:03:33 -0400 (0:00:00.067) 0:00:25.207 ****** 32980 1727096613.28042: entering _queue_task() for managed_node2/debug 32980 1727096613.28388: worker is 1 (out of 1 available) 32980 1727096613.28516: exiting _queue_task() for managed_node2/debug 32980 1727096613.28527: done queuing things up, now waiting for results queue to drain 32980 1727096613.28528: waiting for pending results... 32980 1727096613.28886: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32980 1727096613.28891: in run() - task 0afff68d-5257-457d-ef33-000000000078 32980 1727096613.28894: variable 'ansible_search_path' from source: unknown 32980 1727096613.28896: variable 'ansible_search_path' from source: unknown 32980 1727096613.28899: calling self._execute() 32980 1727096613.28981: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.28985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.28997: variable 'omit' from source: magic vars 32980 1727096613.29378: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.29454: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.29458: variable 'omit' from source: magic vars 32980 1727096613.29460: variable 'omit' from source: magic vars 32980 1727096613.29532: variable 'omit' from source: magic vars 32980 1727096613.29536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096613.29563: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096613.29676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096613.29680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.29683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.29685: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096613.29688: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.29690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.29778: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096613.29782: Set connection var ansible_timeout to 10 32980 1727096613.29784: Set connection var ansible_shell_type to sh 32980 1727096613.29786: Set connection var ansible_connection to ssh 32980 1727096613.29788: Set connection var ansible_shell_executable to /bin/sh 32980 1727096613.29793: Set connection var ansible_pipelining to False 32980 1727096613.29816: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.29818: variable 'ansible_connection' from source: unknown 32980 1727096613.29822: variable 'ansible_module_compression' from source: unknown 32980 1727096613.29831: variable 'ansible_shell_type' from source: unknown 32980 1727096613.29833: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.29836: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.29840: variable 'ansible_pipelining' from source: unknown 32980 1727096613.29843: variable 'ansible_timeout' from source: unknown 32980 1727096613.29845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.29990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096613.29999: variable 'omit' from source: magic vars 32980 1727096613.30005: starting attempt loop 32980 1727096613.30007: running the handler 32980 1727096613.30175: variable '__network_connections_result' from source: set_fact 32980 1727096613.30179: variable '__network_connections_result' from source: set_fact 32980 1727096613.30252: handler run complete 32980 1727096613.30284: attempt loop complete, returning result 32980 1727096613.30287: _execute() done 32980 1727096613.30290: dumping result to json 32980 1727096613.30292: done dumping result, returning 32980 1727096613.30302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-457d-ef33-000000000078] 32980 1727096613.30305: sending task result for task 0afff68d-5257-457d-ef33-000000000078 32980 1727096613.30401: done sending task result for task 0afff68d-5257-457d-ef33-000000000078 32980 1727096613.30404: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 32980 1727096613.30491: no more pending results, returning what we have 32980 1727096613.30494: results queue empty 32980 1727096613.30495: checking for any_errors_fatal 32980 1727096613.30501: done checking for any_errors_fatal 32980 1727096613.30502: checking for max_fail_percentage 32980 
1727096613.30504: done checking for max_fail_percentage 32980 1727096613.30505: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.30506: done checking to see if all hosts have failed 32980 1727096613.30507: getting the remaining hosts for this loop 32980 1727096613.30508: done getting the remaining hosts for this loop 32980 1727096613.30512: getting the next task for host managed_node2 32980 1727096613.30520: done getting next task for host managed_node2 32980 1727096613.30523: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32980 1727096613.30527: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096613.30538: getting variables 32980 1727096613.30540: in VariableManager get_vars() 32980 1727096613.30783: Calling all_inventory to load vars for managed_node2 32980 1727096613.30786: Calling groups_inventory to load vars for managed_node2 32980 1727096613.30788: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.30797: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.30799: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.30802: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.32121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.33729: done with get_vars() 32980 1727096613.33750: done getting variables 32980 1727096613.33814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:03:33 -0400 (0:00:00.058) 0:00:25.265 ****** 32980 1727096613.33853: entering _queue_task() for managed_node2/debug 32980 1727096613.34276: worker is 1 (out of 1 available) 32980 1727096613.34288: exiting _queue_task() for managed_node2/debug 32980 1727096613.34300: done queuing things up, now waiting for results queue to drain 32980 1727096613.34301: waiting for pending results... 
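The debug task traced above printed the __network_connections_result fact for the two profiles being taken down (lsr101 and lsr101.90, both persistent_state: absent); its companion task for network_state is evaluated next and skipped because network_state is empty. For orientation only, a pair of tasks like these might look roughly as follows; this is an illustrative sketch, not the verbatim source of the fedora.linux_system_roles.network role, and the variable passed to the second debug is an assumption (only the task names and the when-condition appear in the log):

  - name: Show debug messages for the network_connections
    debug:
      var: __network_connections_result

  - name: Show debug messages for the network_state
    debug:
      var: network_state        # assumed variable; only the condition below is confirmed by the log
    when: network_state != {}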
32980 1727096613.34531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32980 1727096613.34629: in run() - task 0afff68d-5257-457d-ef33-000000000079 32980 1727096613.34633: variable 'ansible_search_path' from source: unknown 32980 1727096613.34637: variable 'ansible_search_path' from source: unknown 32980 1727096613.34708: calling self._execute() 32980 1727096613.34779: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.34783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.34852: variable 'omit' from source: magic vars 32980 1727096613.35181: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.35197: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.35329: variable 'network_state' from source: role '' defaults 32980 1727096613.35344: Evaluated conditional (network_state != {}): False 32980 1727096613.35348: when evaluation is False, skipping this task 32980 1727096613.35350: _execute() done 32980 1727096613.35353: dumping result to json 32980 1727096613.35355: done dumping result, returning 32980 1727096613.35392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-457d-ef33-000000000079] 32980 1727096613.35395: sending task result for task 0afff68d-5257-457d-ef33-000000000079 32980 1727096613.35458: done sending task result for task 0afff68d-5257-457d-ef33-000000000079 32980 1727096613.35461: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 32980 1727096613.35508: no more pending results, returning what we have 32980 1727096613.35512: results queue empty 32980 1727096613.35513: checking for any_errors_fatal 32980 1727096613.35523: done checking for any_errors_fatal 32980 1727096613.35524: checking for max_fail_percentage 32980 1727096613.35527: done checking for max_fail_percentage 32980 1727096613.35528: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.35529: done checking to see if all hosts have failed 32980 1727096613.35529: getting the remaining hosts for this loop 32980 1727096613.35531: done getting the remaining hosts for this loop 32980 1727096613.35535: getting the next task for host managed_node2 32980 1727096613.35544: done getting next task for host managed_node2 32980 1727096613.35550: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32980 1727096613.35554: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096613.35574: getting variables 32980 1727096613.35577: in VariableManager get_vars() 32980 1727096613.35618: Calling all_inventory to load vars for managed_node2 32980 1727096613.35621: Calling groups_inventory to load vars for managed_node2 32980 1727096613.35624: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.35636: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.35640: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.35643: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.37290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.38911: done with get_vars() 32980 1727096613.38937: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:03:33 -0400 (0:00:00.051) 0:00:25.316 ****** 32980 1727096613.39032: entering _queue_task() for managed_node2/ping 32980 1727096613.39296: worker is 1 (out of 1 available) 32980 1727096613.39307: exiting _queue_task() for managed_node2/ping 32980 1727096613.39318: done queuing things up, now waiting for results queue to drain 32980 1727096613.39320: waiting for pending results... 32980 1727096613.39762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 32980 1727096613.39770: in run() - task 0afff68d-5257-457d-ef33-00000000007a 32980 1727096613.39777: variable 'ansible_search_path' from source: unknown 32980 1727096613.39780: variable 'ansible_search_path' from source: unknown 32980 1727096613.39782: calling self._execute() 32980 1727096613.39848: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.39852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.39866: variable 'omit' from source: magic vars 32980 1727096613.40295: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.40299: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.40302: variable 'omit' from source: magic vars 32980 1727096613.40362: variable 'omit' from source: magic vars 32980 1727096613.40399: variable 'omit' from source: magic vars 32980 1727096613.40442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096613.40491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096613.40507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096613.40526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.40537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096613.40578: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096613.40582: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.40584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.40696: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096613.40699: Set 
connection var ansible_timeout to 10 32980 1727096613.40706: Set connection var ansible_shell_type to sh 32980 1727096613.40709: Set connection var ansible_connection to ssh 32980 1727096613.40711: Set connection var ansible_shell_executable to /bin/sh 32980 1727096613.40717: Set connection var ansible_pipelining to False 32980 1727096613.40741: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.40744: variable 'ansible_connection' from source: unknown 32980 1727096613.40747: variable 'ansible_module_compression' from source: unknown 32980 1727096613.40749: variable 'ansible_shell_type' from source: unknown 32980 1727096613.40752: variable 'ansible_shell_executable' from source: unknown 32980 1727096613.40754: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.40771: variable 'ansible_pipelining' from source: unknown 32980 1727096613.40778: variable 'ansible_timeout' from source: unknown 32980 1727096613.40781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.41036: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32980 1727096613.41041: variable 'omit' from source: magic vars 32980 1727096613.41043: starting attempt loop 32980 1727096613.41046: running the handler 32980 1727096613.41048: _low_level_execute_command(): starting 32980 1727096613.41051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096613.41979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.41995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.42029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.43721: stdout chunk (state=3): >>>/root <<< 32980 1727096613.43882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.43885: stdout chunk (state=3): >>><<< 32980 1727096613.43888: stderr chunk (state=3): >>><<< 32980 1727096613.43907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096613.43926: _low_level_execute_command(): starting 32980 1727096613.43937: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554 `" && echo ansible-tmp-1727096613.4391356-34144-142368568411554="` echo /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554 `" ) && sleep 0' 32980 1727096613.44571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.44590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.44607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096613.44631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096613.44652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096613.44687: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096613.44749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.44799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096613.44816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.44849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.44907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.46796: stdout chunk (state=3): >>>ansible-tmp-1727096613.4391356-34144-142368568411554=/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554 <<< 32980 1727096613.46948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.46951: stdout chunk (state=3): >>><<< 32980 1727096613.46954: stderr chunk (state=3): >>><<< 32980 1727096613.46975: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096613.4391356-34144-142368568411554=/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096613.47175: variable 'ansible_module_compression' from source: unknown 32980 1727096613.47179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 32980 1727096613.47181: variable 'ansible_facts' from source: unknown 32980 1727096613.47195: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py 32980 1727096613.47396: Sending initial data 32980 1727096613.47404: Sent initial data (153 bytes) 32980 1727096613.47978: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.48067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.48089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.48122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096613.48138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.48159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.48242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.49793: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096613.49835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096613.49894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpr0tovyh7 /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py <<< 32980 1727096613.49926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py" <<< 32980 1727096613.49984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpr0tovyh7" to remote "/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py" <<< 32980 1727096613.50722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.50759: stderr chunk (state=3): >>><<< 32980 1727096613.50771: stdout chunk (state=3): >>><<< 32980 1727096613.50799: done transferring module to remote 32980 1727096613.50831: _low_level_execute_command(): starting 32980 1727096613.50834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/ /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py && sleep 0' 32980 1727096613.51456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.51506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096613.51519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096613.51604: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096613.51617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.51637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096613.51652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 
1727096613.51679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.51736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.53481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.53500: stderr chunk (state=3): >>><<< 32980 1727096613.53509: stdout chunk (state=3): >>><<< 32980 1727096613.53524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096613.53527: _low_level_execute_command(): starting 32980 1727096613.53533: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/AnsiballZ_ping.py && sleep 0' 32980 1727096613.54172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.54178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.54181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096613.54184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096613.54186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096613.54188: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096613.54190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.54192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096613.54194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096613.54195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096613.54197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.54230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096613.54266: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096613.54279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.54296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.54358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.69344: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32980 1727096613.70783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096613.70787: stdout chunk (state=3): >>><<< 32980 1727096613.70789: stderr chunk (state=3): >>><<< 32980 1727096613.70792: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
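The {"ping": "pong"} payload returned just above is the entire output of the ping module that backs the "Re-test connectivity" task queued earlier (tasks/main.yml:192). The task name and module come from the log; as a minimal sketch rather than the role's verbatim YAML, the task amounts to:

  - name: Re-test connectivity
    ping:        # fully qualified form: ansible.builtin.ping

After this, the remote temp directory created for AnsiballZ_ping.py is removed and the ok result is reported for managed_node2, as the next entries show.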
32980 1727096613.70795: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096613.70797: _low_level_execute_command(): starting 32980 1727096613.70800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096613.4391356-34144-142368568411554/ > /dev/null 2>&1 && sleep 0' 32980 1727096613.71338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096613.71346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096613.71358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096613.71380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096613.71470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096613.71539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096613.71582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096613.73672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096613.73677: stdout chunk (state=3): >>><<< 32980 1727096613.73679: stderr chunk (state=3): >>><<< 32980 1727096613.73681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096613.73683: handler run complete 32980 1727096613.73685: attempt loop complete, returning result 32980 1727096613.73687: _execute() done 32980 1727096613.73688: dumping result to json 32980 1727096613.73690: done dumping result, returning 32980 1727096613.73692: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-457d-ef33-00000000007a] 32980 1727096613.73693: sending task result for task 0afff68d-5257-457d-ef33-00000000007a 32980 1727096613.73753: done sending task result for task 0afff68d-5257-457d-ef33-00000000007a 32980 1727096613.73756: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 32980 1727096613.73825: no more pending results, returning what we have 32980 1727096613.73829: results queue empty 32980 1727096613.73830: checking for any_errors_fatal 32980 1727096613.73836: done checking for any_errors_fatal 32980 1727096613.73837: checking for max_fail_percentage 32980 1727096613.73839: done checking for max_fail_percentage 32980 1727096613.73839: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.73840: done checking to see if all hosts have failed 32980 1727096613.73841: getting the remaining hosts for this loop 32980 1727096613.73842: done getting the remaining hosts for this loop 32980 1727096613.73846: getting the next task for host managed_node2 32980 1727096613.73857: done getting next task for host managed_node2 32980 1727096613.73859: ^ task is: TASK: meta (role_complete) 32980 1727096613.73862: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096613.73878: getting variables 32980 1727096613.73880: in VariableManager get_vars() 32980 1727096613.73919: Calling all_inventory to load vars for managed_node2 32980 1727096613.73922: Calling groups_inventory to load vars for managed_node2 32980 1727096613.73924: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.73933: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.73936: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.73938: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.75363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.76981: done with get_vars() 32980 1727096613.77009: done getting variables 32980 1727096613.77095: done queuing things up, now waiting for results queue to drain 32980 1727096613.77098: results queue empty 32980 1727096613.77099: checking for any_errors_fatal 32980 1727096613.77101: done checking for any_errors_fatal 32980 1727096613.77107: checking for max_fail_percentage 32980 1727096613.77108: done checking for max_fail_percentage 32980 1727096613.77109: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.77110: done checking to see if all hosts have failed 32980 1727096613.77110: getting the remaining hosts for this loop 32980 1727096613.77111: done getting the remaining hosts for this loop 32980 1727096613.77114: getting the next task for host managed_node2 32980 1727096613.77119: done getting next task for host managed_node2 32980 1727096613.77121: ^ task is: TASK: Include the task 'manage_test_interface.yml' 32980 1727096613.77123: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096613.77125: getting variables 32980 1727096613.77126: in VariableManager get_vars() 32980 1727096613.77141: Calling all_inventory to load vars for managed_node2 32980 1727096613.77144: Calling groups_inventory to load vars for managed_node2 32980 1727096613.77145: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.77150: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.77152: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.77155: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.78402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.79969: done with get_vars() 32980 1727096613.79988: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Monday 23 September 2024 09:03:33 -0400 (0:00:00.410) 0:00:25.727 ****** 32980 1727096613.80054: entering _queue_task() for managed_node2/include_tasks 32980 1727096613.80600: worker is 1 (out of 1 available) 32980 1727096613.80609: exiting _queue_task() for managed_node2/include_tasks 32980 1727096613.80619: done queuing things up, now waiting for results queue to drain 32980 1727096613.80620: waiting for pending results... 
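The include_tasks action queued just above pulls manage_test_interface.yml from the test playbook's tasks directory into the play for managed_node2, which is what the "we have included files to process" messages below report. A minimal sketch of an include like the one at tests_vlan_mtu.yml:73 (the task name and target file are taken from the log; the relative path form is an assumption):

  - name: Include the task 'manage_test_interface.yml'
    include_tasks: tasks/manage_test_interface.yml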
32980 1727096613.80726: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 32980 1727096613.80816: in run() - task 0afff68d-5257-457d-ef33-0000000000aa 32980 1727096613.80830: variable 'ansible_search_path' from source: unknown 32980 1727096613.80876: calling self._execute() 32980 1727096613.80984: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.80988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.80990: variable 'omit' from source: magic vars 32980 1727096613.81366: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.81380: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.81384: _execute() done 32980 1727096613.81396: dumping result to json 32980 1727096613.81399: done dumping result, returning 32980 1727096613.81409: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-457d-ef33-0000000000aa] 32980 1727096613.81412: sending task result for task 0afff68d-5257-457d-ef33-0000000000aa 32980 1727096613.81533: no more pending results, returning what we have 32980 1727096613.81538: in VariableManager get_vars() 32980 1727096613.81587: Calling all_inventory to load vars for managed_node2 32980 1727096613.81590: Calling groups_inventory to load vars for managed_node2 32980 1727096613.81593: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.81612: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.81616: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.81621: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.82145: done sending task result for task 0afff68d-5257-457d-ef33-0000000000aa 32980 1727096613.82149: WORKER PROCESS EXITING 32980 1727096613.83282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.85105: done with get_vars() 32980 1727096613.85135: variable 'ansible_search_path' from source: unknown 32980 1727096613.85147: we have included files to process 32980 1727096613.85148: generating all_blocks data 32980 1727096613.85149: done generating all_blocks data 32980 1727096613.85154: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096613.85156: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096613.85158: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32980 1727096613.85614: in VariableManager get_vars() 32980 1727096613.85636: done with get_vars() 32980 1727096613.86612: done processing included file 32980 1727096613.86614: iterating over new_blocks loaded from include file 32980 1727096613.86615: in VariableManager get_vars() 32980 1727096613.86631: done with get_vars() 32980 1727096613.86633: filtering new block on tags 32980 1727096613.86663: done filtering new block on tags 32980 1727096613.86666: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 32980 1727096613.86675: extending task lists for 
all hosts with included blocks 32980 1727096613.89503: done extending task lists 32980 1727096613.89505: done processing included files 32980 1727096613.89506: results queue empty 32980 1727096613.89506: checking for any_errors_fatal 32980 1727096613.89508: done checking for any_errors_fatal 32980 1727096613.89508: checking for max_fail_percentage 32980 1727096613.89510: done checking for max_fail_percentage 32980 1727096613.89510: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.89511: done checking to see if all hosts have failed 32980 1727096613.89512: getting the remaining hosts for this loop 32980 1727096613.89513: done getting the remaining hosts for this loop 32980 1727096613.89515: getting the next task for host managed_node2 32980 1727096613.89519: done getting next task for host managed_node2 32980 1727096613.89521: ^ task is: TASK: Ensure state in ["present", "absent"] 32980 1727096613.89523: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096613.89526: getting variables 32980 1727096613.89526: in VariableManager get_vars() 32980 1727096613.89539: Calling all_inventory to load vars for managed_node2 32980 1727096613.89541: Calling groups_inventory to load vars for managed_node2 32980 1727096613.89543: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.89547: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.89550: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.89553: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.90634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.92182: done with get_vars() 32980 1727096613.92201: done getting variables 32980 1727096613.92240: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:03:33 -0400 (0:00:00.122) 0:00:25.849 ****** 32980 1727096613.92270: entering _queue_task() for managed_node2/fail 32980 1727096613.92569: worker is 1 (out of 1 available) 32980 1727096613.92584: exiting _queue_task() for managed_node2/fail 32980 1727096613.92595: done queuing things up, now waiting for results queue to drain 32980 1727096613.92597: waiting for pending results... 
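manage_test_interface.yml opens with two input guards that abort the include when its parameters are out of range; in this run both are skipped because their conditions evaluate to False (the state check below and the type check that follows it). A hedged sketch of that guard pattern, where the task names and when-conditions match the log but the failure messages are placeholders, not the file's actual text:

  - name: Ensure state in ["present", "absent"]
    fail:
      msg: "state must be present or absent"        # placeholder message
    when: state not in ["present", "absent"]

  - name: Ensure type in ["dummy", "tap", "veth"]
    fail:
      msg: "type must be dummy, tap or veth"        # placeholder message
    when: type not in ["dummy", "tap", "veth"]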
32980 1727096613.92985: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 32980 1727096613.92990: in run() - task 0afff68d-5257-457d-ef33-00000000093c 32980 1727096613.92993: variable 'ansible_search_path' from source: unknown 32980 1727096613.92996: variable 'ansible_search_path' from source: unknown 32980 1727096613.92999: calling self._execute() 32980 1727096613.93018: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.93023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.93034: variable 'omit' from source: magic vars 32980 1727096613.93383: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.93394: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.93533: variable 'state' from source: include params 32980 1727096613.93536: Evaluated conditional (state not in ["present", "absent"]): False 32980 1727096613.93539: when evaluation is False, skipping this task 32980 1727096613.93544: _execute() done 32980 1727096613.93550: dumping result to json 32980 1727096613.93553: done dumping result, returning 32980 1727096613.93556: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-457d-ef33-00000000093c] 32980 1727096613.93558: sending task result for task 0afff68d-5257-457d-ef33-00000000093c 32980 1727096613.93639: done sending task result for task 0afff68d-5257-457d-ef33-00000000093c 32980 1727096613.93642: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 32980 1727096613.93702: no more pending results, returning what we have 32980 1727096613.93706: results queue empty 32980 1727096613.93707: checking for any_errors_fatal 32980 1727096613.93708: done checking for any_errors_fatal 32980 1727096613.93709: checking for max_fail_percentage 32980 1727096613.93711: done checking for max_fail_percentage 32980 1727096613.93712: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.93713: done checking to see if all hosts have failed 32980 1727096613.93713: getting the remaining hosts for this loop 32980 1727096613.93715: done getting the remaining hosts for this loop 32980 1727096613.93718: getting the next task for host managed_node2 32980 1727096613.93725: done getting next task for host managed_node2 32980 1727096613.93727: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 32980 1727096613.93730: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096613.93733: getting variables 32980 1727096613.93735: in VariableManager get_vars() 32980 1727096613.93766: Calling all_inventory to load vars for managed_node2 32980 1727096613.93771: Calling groups_inventory to load vars for managed_node2 32980 1727096613.93775: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.93784: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.93786: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.93789: Calling groups_plugins_play to load vars for managed_node2 32980 1727096613.95137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096613.96701: done with get_vars() 32980 1727096613.96721: done getting variables 32980 1727096613.96779: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:03:33 -0400 (0:00:00.045) 0:00:25.894 ****** 32980 1727096613.96807: entering _queue_task() for managed_node2/fail 32980 1727096613.97060: worker is 1 (out of 1 available) 32980 1727096613.97178: exiting _queue_task() for managed_node2/fail 32980 1727096613.97190: done queuing things up, now waiting for results queue to drain 32980 1727096613.97191: waiting for pending results... 32980 1727096613.97426: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 32980 1727096613.97527: in run() - task 0afff68d-5257-457d-ef33-00000000093d 32980 1727096613.97530: variable 'ansible_search_path' from source: unknown 32980 1727096613.97533: variable 'ansible_search_path' from source: unknown 32980 1727096613.97562: calling self._execute() 32980 1727096613.97875: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096613.97880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096613.97882: variable 'omit' from source: magic vars 32980 1727096613.98072: variable 'ansible_distribution_major_version' from source: facts 32980 1727096613.98091: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096613.98244: variable 'type' from source: play vars 32980 1727096613.98257: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 32980 1727096613.98264: when evaluation is False, skipping this task 32980 1727096613.98276: _execute() done 32980 1727096613.98284: dumping result to json 32980 1727096613.98292: done dumping result, returning 32980 1727096613.98300: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-457d-ef33-00000000093d] 32980 1727096613.98309: sending task result for task 0afff68d-5257-457d-ef33-00000000093d 32980 1727096613.98412: done sending task result for task 0afff68d-5257-457d-ef33-00000000093d 32980 1727096613.98419: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 32980 1727096613.98471: no more pending 
results, returning what we have 32980 1727096613.98477: results queue empty 32980 1727096613.98478: checking for any_errors_fatal 32980 1727096613.98484: done checking for any_errors_fatal 32980 1727096613.98485: checking for max_fail_percentage 32980 1727096613.98487: done checking for max_fail_percentage 32980 1727096613.98488: checking to see if all hosts have failed and the running result is not ok 32980 1727096613.98489: done checking to see if all hosts have failed 32980 1727096613.98489: getting the remaining hosts for this loop 32980 1727096613.98491: done getting the remaining hosts for this loop 32980 1727096613.98494: getting the next task for host managed_node2 32980 1727096613.98503: done getting next task for host managed_node2 32980 1727096613.98505: ^ task is: TASK: Include the task 'show_interfaces.yml' 32980 1727096613.98509: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096613.98513: getting variables 32980 1727096613.98515: in VariableManager get_vars() 32980 1727096613.98553: Calling all_inventory to load vars for managed_node2 32980 1727096613.98556: Calling groups_inventory to load vars for managed_node2 32980 1727096613.98559: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096613.98575: Calling all_plugins_play to load vars for managed_node2 32980 1727096613.98579: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096613.98582: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.00092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.01812: done with get_vars() 32980 1727096614.01832: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:03:34 -0400 (0:00:00.051) 0:00:25.945 ****** 32980 1727096614.01927: entering _queue_task() for managed_node2/include_tasks 32980 1727096614.02376: worker is 1 (out of 1 available) 32980 1727096614.02387: exiting _queue_task() for managed_node2/include_tasks 32980 1727096614.02398: done queuing things up, now waiting for results queue to drain 32980 1727096614.02399: waiting for pending results... 
32980 1727096614.02478: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 32980 1727096614.02626: in run() - task 0afff68d-5257-457d-ef33-00000000093e 32980 1727096614.02630: variable 'ansible_search_path' from source: unknown 32980 1727096614.02633: variable 'ansible_search_path' from source: unknown 32980 1727096614.02663: calling self._execute() 32980 1727096614.02769: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.02844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.02847: variable 'omit' from source: magic vars 32980 1727096614.03179: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.03197: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.03208: _execute() done 32980 1727096614.03216: dumping result to json 32980 1727096614.03225: done dumping result, returning 32980 1727096614.03235: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-457d-ef33-00000000093e] 32980 1727096614.03245: sending task result for task 0afff68d-5257-457d-ef33-00000000093e 32980 1727096614.03404: no more pending results, returning what we have 32980 1727096614.03409: in VariableManager get_vars() 32980 1727096614.03457: Calling all_inventory to load vars for managed_node2 32980 1727096614.03461: Calling groups_inventory to load vars for managed_node2 32980 1727096614.03463: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.03481: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.03484: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.03487: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.04181: done sending task result for task 0afff68d-5257-457d-ef33-00000000093e 32980 1727096614.04184: WORKER PROCESS EXITING 32980 1727096614.04887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.06416: done with get_vars() 32980 1727096614.06433: variable 'ansible_search_path' from source: unknown 32980 1727096614.06435: variable 'ansible_search_path' from source: unknown 32980 1727096614.06470: we have included files to process 32980 1727096614.06472: generating all_blocks data 32980 1727096614.06476: done generating all_blocks data 32980 1727096614.06481: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096614.06482: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096614.06485: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32980 1727096614.06583: in VariableManager get_vars() 32980 1727096614.06605: done with get_vars() 32980 1727096614.06716: done processing included file 32980 1727096614.06719: iterating over new_blocks loaded from include file 32980 1727096614.06720: in VariableManager get_vars() 32980 1727096614.06740: done with get_vars() 32980 1727096614.06742: filtering new block on tags 32980 1727096614.06759: done filtering new block on tags 32980 1727096614.06762: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 32980 1727096614.06766: extending task lists for all hosts with included blocks 32980 1727096614.07137: done extending task lists 32980 1727096614.07139: done processing included files 32980 1727096614.07140: results queue empty 32980 1727096614.07140: checking for any_errors_fatal 32980 1727096614.07143: done checking for any_errors_fatal 32980 1727096614.07144: checking for max_fail_percentage 32980 1727096614.07145: done checking for max_fail_percentage 32980 1727096614.07146: checking to see if all hosts have failed and the running result is not ok 32980 1727096614.07146: done checking to see if all hosts have failed 32980 1727096614.07147: getting the remaining hosts for this loop 32980 1727096614.07148: done getting the remaining hosts for this loop 32980 1727096614.07150: getting the next task for host managed_node2 32980 1727096614.07154: done getting next task for host managed_node2 32980 1727096614.07156: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32980 1727096614.07159: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096614.07162: getting variables 32980 1727096614.07163: in VariableManager get_vars() 32980 1727096614.07179: Calling all_inventory to load vars for managed_node2 32980 1727096614.07181: Calling groups_inventory to load vars for managed_node2 32980 1727096614.07183: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.07187: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.07190: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.07192: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.08397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.09958: done with get_vars() 32980 1727096614.09981: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:03:34 -0400 (0:00:00.081) 0:00:26.027 ****** 32980 1727096614.10047: entering _queue_task() for managed_node2/include_tasks 32980 1727096614.10290: worker is 1 (out of 1 available) 32980 1727096614.10300: exiting _queue_task() for managed_node2/include_tasks 32980 1727096614.10311: done queuing things up, now waiting for results queue to drain 32980 1727096614.10312: waiting for pending results... 
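The include file just loaded, show_interfaces.yml, contains at least the two tasks this log goes on to run: the include of get_current_interfaces.yml at line 3 (queued above) and the 'Show current_interfaces' debug at line 5 (run further below). A minimal sketch consistent with those task names, task paths, and the debug message printed later; the variable name current_interfaces comes from the log, everything else is an assumption.

# Sketch (assumed): show_interfaces.yml
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"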
32980 1727096614.10556: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 32980 1727096614.10658: in run() - task 0afff68d-5257-457d-ef33-000000000aa0 32980 1727096614.10685: variable 'ansible_search_path' from source: unknown 32980 1727096614.10692: variable 'ansible_search_path' from source: unknown 32980 1727096614.10732: calling self._execute() 32980 1727096614.10836: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.10847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.10863: variable 'omit' from source: magic vars 32980 1727096614.11257: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.11280: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.11292: _execute() done 32980 1727096614.11300: dumping result to json 32980 1727096614.11307: done dumping result, returning 32980 1727096614.11317: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-457d-ef33-000000000aa0] 32980 1727096614.11334: sending task result for task 0afff68d-5257-457d-ef33-000000000aa0 32980 1727096614.11456: no more pending results, returning what we have 32980 1727096614.11462: in VariableManager get_vars() 32980 1727096614.11513: Calling all_inventory to load vars for managed_node2 32980 1727096614.11516: Calling groups_inventory to load vars for managed_node2 32980 1727096614.11519: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.11533: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.11536: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.11539: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.12380: done sending task result for task 0afff68d-5257-457d-ef33-000000000aa0 32980 1727096614.12383: WORKER PROCESS EXITING 32980 1727096614.13021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.14532: done with get_vars() 32980 1727096614.14550: variable 'ansible_search_path' from source: unknown 32980 1727096614.14552: variable 'ansible_search_path' from source: unknown 32980 1727096614.14611: we have included files to process 32980 1727096614.14613: generating all_blocks data 32980 1727096614.14614: done generating all_blocks data 32980 1727096614.14615: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096614.14616: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096614.14619: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32980 1727096614.14878: done processing included file 32980 1727096614.14880: iterating over new_blocks loaded from include file 32980 1727096614.14882: in VariableManager get_vars() 32980 1727096614.14901: done with get_vars() 32980 1727096614.14904: filtering new block on tags 32980 1727096614.14921: done filtering new block on tags 32980 1727096614.14924: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 32980 1727096614.14928: extending task lists for all hosts with included blocks 32980 1727096614.15087: done extending task lists 32980 1727096614.15088: done processing included files 32980 1727096614.15089: results queue empty 32980 1727096614.15090: checking for any_errors_fatal 32980 1727096614.15093: done checking for any_errors_fatal 32980 1727096614.15094: checking for max_fail_percentage 32980 1727096614.15095: done checking for max_fail_percentage 32980 1727096614.15095: checking to see if all hosts have failed and the running result is not ok 32980 1727096614.15096: done checking to see if all hosts have failed 32980 1727096614.15097: getting the remaining hosts for this loop 32980 1727096614.15098: done getting the remaining hosts for this loop 32980 1727096614.15100: getting the next task for host managed_node2 32980 1727096614.15105: done getting next task for host managed_node2 32980 1727096614.15107: ^ task is: TASK: Gather current interface info 32980 1727096614.15110: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096614.15112: getting variables 32980 1727096614.15113: in VariableManager get_vars() 32980 1727096614.15125: Calling all_inventory to load vars for managed_node2 32980 1727096614.15127: Calling groups_inventory to load vars for managed_node2 32980 1727096614.15129: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.15133: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.15135: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.15138: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.16288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.17797: done with get_vars() 32980 1727096614.17817: done getting variables 32980 1727096614.17857: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:03:34 -0400 (0:00:00.078) 0:00:26.105 ****** 32980 1727096614.17892: entering _queue_task() for managed_node2/command 32980 1727096614.18131: worker is 1 (out of 1 available) 32980 1727096614.18143: exiting _queue_task() for managed_node2/command 32980 1727096614.18154: done queuing things up, now waiting for results queue to drain 32980 1727096614.18156: waiting for pending results... 
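The 'Gather current interface info' task queued here is the command task at get_current_interfaces.yml:3. Its module arguments appear verbatim further down in the module invocation (chdir: /sys/class/net, _raw_params: ls -1), and the later 'Set current_interfaces' task reads a _current_interfaces variable, so the command result is presumably registered under that name. A sketch under those assumptions:

# Sketch (assumed): get_current_interfaces.yml, line 3
- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces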
32980 1727096614.18422: running TaskExecutor() for managed_node2/TASK: Gather current interface info 32980 1727096614.18548: in run() - task 0afff68d-5257-457d-ef33-000000000ad7 32980 1727096614.18570: variable 'ansible_search_path' from source: unknown 32980 1727096614.18582: variable 'ansible_search_path' from source: unknown 32980 1727096614.18625: calling self._execute() 32980 1727096614.18721: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.18732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.18749: variable 'omit' from source: magic vars 32980 1727096614.19271: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.19279: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.19281: variable 'omit' from source: magic vars 32980 1727096614.19284: variable 'omit' from source: magic vars 32980 1727096614.19286: variable 'omit' from source: magic vars 32980 1727096614.19292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096614.19331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096614.19357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096614.19385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.19405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.19437: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096614.19446: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.19455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.19575: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096614.19588: Set connection var ansible_timeout to 10 32980 1727096614.19596: Set connection var ansible_shell_type to sh 32980 1727096614.19604: Set connection var ansible_connection to ssh 32980 1727096614.19622: Set connection var ansible_shell_executable to /bin/sh 32980 1727096614.19633: Set connection var ansible_pipelining to False 32980 1727096614.19657: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.19666: variable 'ansible_connection' from source: unknown 32980 1727096614.19680: variable 'ansible_module_compression' from source: unknown 32980 1727096614.19688: variable 'ansible_shell_type' from source: unknown 32980 1727096614.19696: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.19703: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.19711: variable 'ansible_pipelining' from source: unknown 32980 1727096614.19718: variable 'ansible_timeout' from source: unknown 32980 1727096614.19776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.19892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096614.19975: variable 'omit' from source: magic vars 32980 
1727096614.19979: starting attempt loop 32980 1727096614.19982: running the handler 32980 1727096614.19984: _low_level_execute_command(): starting 32980 1727096614.19987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096614.20783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.20805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.20825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.20851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.20951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.22615: stdout chunk (state=3): >>>/root <<< 32980 1727096614.22735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.22741: stdout chunk (state=3): >>><<< 32980 1727096614.22746: stderr chunk (state=3): >>><<< 32980 1727096614.22768: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.22780: _low_level_execute_command(): starting 32980 1727096614.22786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041 `" && echo ansible-tmp-1727096614.2276537-34175-242421787192041="` echo 
/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041 `" ) && sleep 0' 32980 1727096614.23218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.23222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096614.23225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.23234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.23273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.23277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.23318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.25217: stdout chunk (state=3): >>>ansible-tmp-1727096614.2276537-34175-242421787192041=/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041 <<< 32980 1727096614.25330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.25359: stderr chunk (state=3): >>><<< 32980 1727096614.25361: stdout chunk (state=3): >>><<< 32980 1727096614.25376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096614.2276537-34175-242421787192041=/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.25407: variable 'ansible_module_compression' from source: unknown 32980 1727096614.25446: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096614.25478: variable 'ansible_facts' from source: unknown 32980 1727096614.25533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py 32980 1727096614.25631: Sending initial data 32980 1727096614.25635: Sent initial data (156 bytes) 32980 1727096614.26084: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.26088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.26111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.26150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.27698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32980 1727096614.27708: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096614.27726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096614.27761: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp8h3zr7ak /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py <<< 32980 1727096614.27764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py" <<< 32980 1727096614.27797: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp8h3zr7ak" to remote "/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py" <<< 32980 1727096614.27800: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py" <<< 32980 1727096614.28280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.28312: stderr chunk (state=3): >>><<< 32980 1727096614.28317: stdout chunk (state=3): >>><<< 32980 1727096614.28334: done transferring module to remote 32980 1727096614.28342: _low_level_execute_command(): starting 32980 1727096614.28347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/ /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py && sleep 0' 32980 1727096614.28738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.28747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.28777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.28781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.28783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.28822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.28843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.28873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.30732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.30735: stdout chunk (state=3): >>><<< 32980 1727096614.30744: stderr chunk (state=3): >>><<< 32980 1727096614.30762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.30766: _low_level_execute_command(): starting 32980 1727096614.30776: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/AnsiballZ_command.py && sleep 0' 32980 1727096614.31213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.31216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096614.31219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096614.31222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096614.31224: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.31262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.31272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.31325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.46903: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:34.464602", "end": "2024-09-23 09:03:34.467954", "delta": "0:00:00.003352", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096614.48388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096614.48414: stderr chunk (state=3): >>><<< 32980 1727096614.48417: stdout chunk (state=3): >>><<< 32980 1727096614.48436: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:03:34.464602", "end": "2024-09-23 09:03:34.467954", "delta": "0:00:00.003352", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
32980 1727096614.48466: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096614.48478: _low_level_execute_command(): starting 32980 1727096614.48480: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096614.2276537-34175-242421787192041/ > /dev/null 2>&1 && sleep 0' 32980 1727096614.48945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.48948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 32980 1727096614.48950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32980 1727096614.48953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.49007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.49011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.49017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.49053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.50855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.50886: stderr chunk (state=3): >>><<< 32980 1727096614.50889: stdout chunk (state=3): >>><<< 32980 1727096614.50904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.50909: handler run complete 32980 1727096614.50927: Evaluated conditional (False): False 32980 1727096614.50936: attempt loop complete, returning result 32980 1727096614.50938: _execute() done 32980 1727096614.50940: dumping result to json 32980 1727096614.50946: done dumping result, returning 32980 1727096614.50953: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-457d-ef33-000000000ad7] 32980 1727096614.50958: sending task result for task 0afff68d-5257-457d-ef33-000000000ad7 32980 1727096614.51057: done sending task result for task 0afff68d-5257-457d-ef33-000000000ad7 32980 1727096614.51059: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003352", "end": "2024-09-23 09:03:34.467954", "rc": 0, "start": "2024-09-23 09:03:34.464602" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 32980 1727096614.51153: no more pending results, returning what we have 32980 1727096614.51156: results queue empty 32980 1727096614.51157: checking for any_errors_fatal 32980 1727096614.51159: done checking for any_errors_fatal 32980 1727096614.51160: checking for max_fail_percentage 32980 1727096614.51161: done checking for max_fail_percentage 32980 1727096614.51162: checking to see if all hosts have failed and the running result is not ok 32980 1727096614.51163: done checking to see if all hosts have failed 32980 1727096614.51163: getting the remaining hosts for this loop 32980 1727096614.51165: done getting the remaining hosts for this loop 32980 1727096614.51171: getting the next task for host managed_node2 32980 1727096614.51182: done getting next task for host managed_node2 32980 1727096614.51185: ^ task is: TASK: Set current_interfaces 32980 1727096614.51189: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096614.51195: getting variables 32980 1727096614.51196: in VariableManager get_vars() 32980 1727096614.51231: Calling all_inventory to load vars for managed_node2 32980 1727096614.51234: Calling groups_inventory to load vars for managed_node2 32980 1727096614.51236: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.51245: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.51248: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.51250: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.52036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.52899: done with get_vars() 32980 1727096614.52914: done getting variables 32980 1727096614.52958: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:03:34 -0400 (0:00:00.350) 0:00:26.456 ****** 32980 1727096614.52984: entering _queue_task() for managed_node2/set_fact 32980 1727096614.53196: worker is 1 (out of 1 available) 32980 1727096614.53208: exiting _queue_task() for managed_node2/set_fact 32980 1727096614.53220: done queuing things up, now waiting for results queue to drain 32980 1727096614.53222: waiting for pending results... 
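The set_fact being queued here (get_current_interfaces.yml:9) turns the registered command output into the current_interfaces fact whose value is printed in the result just below. Splitting the command's stdout into lines is the most direct way to produce that list, but the exact expression is an assumption:

# Sketch (assumed): get_current_interfaces.yml, line 9
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"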
32980 1727096614.53393: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 32980 1727096614.53482: in run() - task 0afff68d-5257-457d-ef33-000000000ad8 32980 1727096614.53492: variable 'ansible_search_path' from source: unknown 32980 1727096614.53496: variable 'ansible_search_path' from source: unknown 32980 1727096614.53522: calling self._execute() 32980 1727096614.53592: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.53596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.53607: variable 'omit' from source: magic vars 32980 1727096614.53882: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.53894: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.53898: variable 'omit' from source: magic vars 32980 1727096614.53928: variable 'omit' from source: magic vars 32980 1727096614.54004: variable '_current_interfaces' from source: set_fact 32980 1727096614.54052: variable 'omit' from source: magic vars 32980 1727096614.54085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096614.54113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096614.54128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096614.54141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.54151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.54177: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096614.54180: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.54183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.54253: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096614.54257: Set connection var ansible_timeout to 10 32980 1727096614.54260: Set connection var ansible_shell_type to sh 32980 1727096614.54262: Set connection var ansible_connection to ssh 32980 1727096614.54270: Set connection var ansible_shell_executable to /bin/sh 32980 1727096614.54278: Set connection var ansible_pipelining to False 32980 1727096614.54292: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.54295: variable 'ansible_connection' from source: unknown 32980 1727096614.54298: variable 'ansible_module_compression' from source: unknown 32980 1727096614.54300: variable 'ansible_shell_type' from source: unknown 32980 1727096614.54302: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.54304: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.54307: variable 'ansible_pipelining' from source: unknown 32980 1727096614.54310: variable 'ansible_timeout' from source: unknown 32980 1727096614.54314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.54415: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32980 1727096614.54424: variable 'omit' from source: magic vars 32980 1727096614.54430: starting attempt loop 32980 1727096614.54433: running the handler 32980 1727096614.54444: handler run complete 32980 1727096614.54452: attempt loop complete, returning result 32980 1727096614.54454: _execute() done 32980 1727096614.54457: dumping result to json 32980 1727096614.54460: done dumping result, returning 32980 1727096614.54466: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-457d-ef33-000000000ad8] 32980 1727096614.54470: sending task result for task 0afff68d-5257-457d-ef33-000000000ad8 32980 1727096614.54548: done sending task result for task 0afff68d-5257-457d-ef33-000000000ad8 32980 1727096614.54550: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101" ] }, "changed": false } 32980 1727096614.54616: no more pending results, returning what we have 32980 1727096614.54618: results queue empty 32980 1727096614.54619: checking for any_errors_fatal 32980 1727096614.54627: done checking for any_errors_fatal 32980 1727096614.54627: checking for max_fail_percentage 32980 1727096614.54629: done checking for max_fail_percentage 32980 1727096614.54629: checking to see if all hosts have failed and the running result is not ok 32980 1727096614.54630: done checking to see if all hosts have failed 32980 1727096614.54631: getting the remaining hosts for this loop 32980 1727096614.54632: done getting the remaining hosts for this loop 32980 1727096614.54635: getting the next task for host managed_node2 32980 1727096614.54642: done getting next task for host managed_node2 32980 1727096614.54644: ^ task is: TASK: Show current_interfaces 32980 1727096614.54649: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096614.54652: getting variables 32980 1727096614.54654: in VariableManager get_vars() 32980 1727096614.54696: Calling all_inventory to load vars for managed_node2 32980 1727096614.54698: Calling groups_inventory to load vars for managed_node2 32980 1727096614.54700: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.54708: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.54710: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.54713: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.58630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.59482: done with get_vars() 32980 1727096614.59497: done getting variables 32980 1727096614.59528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:03:34 -0400 (0:00:00.065) 0:00:26.522 ****** 32980 1727096614.59546: entering _queue_task() for managed_node2/debug 32980 1727096614.59797: worker is 1 (out of 1 available) 32980 1727096614.59810: exiting _queue_task() for managed_node2/debug 32980 1727096614.59822: done queuing things up, now waiting for results queue to drain 32980 1727096614.59824: waiting for pending results... 32980 1727096614.59989: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 32980 1727096614.60135: in run() - task 0afff68d-5257-457d-ef33-000000000aa1 32980 1727096614.60141: variable 'ansible_search_path' from source: unknown 32980 1727096614.60144: variable 'ansible_search_path' from source: unknown 32980 1727096614.60146: calling self._execute() 32980 1727096614.60416: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.60420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.60423: variable 'omit' from source: magic vars 32980 1727096614.60751: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.60764: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.60776: variable 'omit' from source: magic vars 32980 1727096614.60820: variable 'omit' from source: magic vars 32980 1727096614.60918: variable 'current_interfaces' from source: set_fact 32980 1727096614.60951: variable 'omit' from source: magic vars 32980 1727096614.60993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096614.61028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096614.61050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096614.61078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.61081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.61110: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096614.61113: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.61116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.61220: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096614.61225: Set connection var ansible_timeout to 10 32980 1727096614.61228: Set connection var ansible_shell_type to sh 32980 1727096614.61231: Set connection var ansible_connection to ssh 32980 1727096614.61241: Set connection var ansible_shell_executable to /bin/sh 32980 1727096614.61245: Set connection var ansible_pipelining to False 32980 1727096614.61272: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.61283: variable 'ansible_connection' from source: unknown 32980 1727096614.61396: variable 'ansible_module_compression' from source: unknown 32980 1727096614.61399: variable 'ansible_shell_type' from source: unknown 32980 1727096614.61402: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.61405: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.61407: variable 'ansible_pipelining' from source: unknown 32980 1727096614.61410: variable 'ansible_timeout' from source: unknown 32980 1727096614.61412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.61503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096614.61507: variable 'omit' from source: magic vars 32980 1727096614.61509: starting attempt loop 32980 1727096614.61512: running the handler 32980 1727096614.61514: handler run complete 32980 1727096614.61517: attempt loop complete, returning result 32980 1727096614.61519: _execute() done 32980 1727096614.61526: dumping result to json 32980 1727096614.61532: done dumping result, returning 32980 1727096614.61535: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-457d-ef33-000000000aa1] 32980 1727096614.61537: sending task result for task 0afff68d-5257-457d-ef33-000000000aa1 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101'] 32980 1727096614.61729: no more pending results, returning what we have 32980 1727096614.61732: results queue empty 32980 1727096614.61732: checking for any_errors_fatal 32980 1727096614.61738: done checking for any_errors_fatal 32980 1727096614.61739: checking for max_fail_percentage 32980 1727096614.61745: done checking for max_fail_percentage 32980 1727096614.61745: checking to see if all hosts have failed and the running result is not ok 32980 1727096614.61746: done checking to see if all hosts have failed 32980 1727096614.61747: getting the remaining hosts for this loop 32980 1727096614.61748: done getting the remaining hosts for this loop 32980 1727096614.61751: getting the next task for host managed_node2 32980 1727096614.61758: done getting next task for host managed_node2 32980 1727096614.61760: ^ task is: TASK: Install iproute 32980 1727096614.61763: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096614.61766: getting variables 32980 1727096614.61770: in VariableManager get_vars() 32980 1727096614.61805: Calling all_inventory to load vars for managed_node2 32980 1727096614.61807: Calling groups_inventory to load vars for managed_node2 32980 1727096614.61809: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096614.61817: Calling all_plugins_play to load vars for managed_node2 32980 1727096614.61819: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096614.61822: Calling groups_plugins_play to load vars for managed_node2 32980 1727096614.62365: done sending task result for task 0afff68d-5257-457d-ef33-000000000aa1 32980 1727096614.62372: WORKER PROCESS EXITING 32980 1727096614.63234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096614.65523: done with get_vars() 32980 1727096614.65546: done getting variables 32980 1727096614.65688: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:03:34 -0400 (0:00:00.062) 0:00:26.584 ****** 32980 1727096614.65756: entering _queue_task() for managed_node2/package 32980 1727096614.66015: worker is 1 (out of 1 available) 32980 1727096614.66028: exiting _queue_task() for managed_node2/package 32980 1727096614.66041: done queuing things up, now waiting for results queue to drain 32980 1727096614.66042: waiting for pending results... 
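The next task, 'Install iproute' at manage_test_interface.yml:16, uses the generic package action; the log just below also resolves a __network_is_ostree fact before building the module arguments, presumably to adjust how the package is installed on ostree-based systems. Ignoring that detail, a minimal sketch of the task:

# Sketch (assumed): manage_test_interface.yml, around line 16
- name: Install iproute
  ansible.builtin.package:
    name: iproute
    state: present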
32980 1727096614.66216: running TaskExecutor() for managed_node2/TASK: Install iproute 32980 1727096614.66295: in run() - task 0afff68d-5257-457d-ef33-00000000093f 32980 1727096614.66306: variable 'ansible_search_path' from source: unknown 32980 1727096614.66309: variable 'ansible_search_path' from source: unknown 32980 1727096614.66337: calling self._execute() 32980 1727096614.66414: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.66418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.66427: variable 'omit' from source: magic vars 32980 1727096614.66713: variable 'ansible_distribution_major_version' from source: facts 32980 1727096614.66724: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096614.66729: variable 'omit' from source: magic vars 32980 1727096614.66751: variable 'omit' from source: magic vars 32980 1727096614.66882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32980 1727096614.68803: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32980 1727096614.68911: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32980 1727096614.68915: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32980 1727096614.68934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32980 1727096614.68959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32980 1727096614.69049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32980 1727096614.69077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32980 1727096614.69129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32980 1727096614.69136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32980 1727096614.69150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32980 1727096614.69242: variable '__network_is_ostree' from source: set_fact 32980 1727096614.69248: variable 'omit' from source: magic vars 32980 1727096614.69348: variable 'omit' from source: magic vars 32980 1727096614.69351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096614.69355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096614.69357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096614.69360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32980 1727096614.69370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096614.69399: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096614.69402: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.69406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.69502: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096614.69508: Set connection var ansible_timeout to 10 32980 1727096614.69511: Set connection var ansible_shell_type to sh 32980 1727096614.69514: Set connection var ansible_connection to ssh 32980 1727096614.69520: Set connection var ansible_shell_executable to /bin/sh 32980 1727096614.69525: Set connection var ansible_pipelining to False 32980 1727096614.69546: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.69548: variable 'ansible_connection' from source: unknown 32980 1727096614.69551: variable 'ansible_module_compression' from source: unknown 32980 1727096614.69553: variable 'ansible_shell_type' from source: unknown 32980 1727096614.69556: variable 'ansible_shell_executable' from source: unknown 32980 1727096614.69675: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096614.69679: variable 'ansible_pipelining' from source: unknown 32980 1727096614.69682: variable 'ansible_timeout' from source: unknown 32980 1727096614.69684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096614.69688: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096614.69691: variable 'omit' from source: magic vars 32980 1727096614.69693: starting attempt loop 32980 1727096614.69695: running the handler 32980 1727096614.69697: variable 'ansible_facts' from source: unknown 32980 1727096614.69699: variable 'ansible_facts' from source: unknown 32980 1727096614.69780: _low_level_execute_command(): starting 32980 1727096614.69783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096614.70358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096614.70375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.70434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.70438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.70441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096614.70444: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096614.70446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.70448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096614.70451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096614.70453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 
1727096614.70455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.70457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.70466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.70478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096614.70480: stderr chunk (state=3): >>>debug2: match found <<< 32980 1727096614.70491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.70577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.70589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.70592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.70658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.72348: stdout chunk (state=3): >>>/root <<< 32980 1727096614.72496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.72516: stderr chunk (state=3): >>><<< 32980 1727096614.72539: stdout chunk (state=3): >>><<< 32980 1727096614.72663: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.72666: _low_level_execute_command(): starting 32980 1727096614.72671: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761 `" && echo ansible-tmp-1727096614.7257211-34199-152585278648761="` echo /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761 `" ) && sleep 0' 32980 1727096614.73252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096614.73267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.73294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.73312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.73423: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.73458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.73522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.75450: stdout chunk (state=3): >>>ansible-tmp-1727096614.7257211-34199-152585278648761=/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761 <<< 32980 1727096614.75602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.75680: stdout chunk (state=3): >>><<< 32980 1727096614.75684: stderr chunk (state=3): >>><<< 32980 1727096614.75687: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096614.7257211-34199-152585278648761=/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.75695: variable 'ansible_module_compression' from source: unknown 32980 1727096614.75765: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 32980 1727096614.75836: variable 'ansible_facts' from source: unknown 32980 1727096614.75970: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py 32980 1727096614.76131: Sending initial data 32980 1727096614.76134: Sent initial data (152 bytes) 32980 1727096614.76799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.76849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.76890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.78516: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32980 1727096614.78555: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096614.78595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096614.78675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpti3q9o8k /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py <<< 32980 1727096614.78679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py" <<< 32980 1727096614.78738: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpti3q9o8k" to remote "/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py" <<< 32980 1727096614.79818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.79976: stderr chunk (state=3): >>><<< 32980 1727096614.79980: stdout chunk (state=3): >>><<< 32980 1727096614.79983: done transferring module to remote 32980 1727096614.79985: _low_level_execute_command(): starting 32980 1727096614.79987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/ /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py && sleep 0' 32980 1727096614.80728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096614.80793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.80899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.80919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.80934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.81009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096614.82915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096614.82919: stdout chunk (state=3): >>><<< 32980 1727096614.82983: stderr chunk (state=3): >>><<< 32980 1727096614.82987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096614.82990: _low_level_execute_command(): starting 32980 1727096614.82993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/AnsiballZ_dnf.py && sleep 0' 32980 1727096614.83602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096614.83610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.83628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.83641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.83653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096614.83776: stderr chunk (state=3): >>>debug2: match not found <<< 32980 1727096614.83779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.83782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32980 1727096614.83784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 32980 1727096614.83786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32980 1727096614.83788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096614.83790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096614.83792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096614.83793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096614.83795: stderr chunk (state=3): >>>debug2: match found <<< 32980 1727096614.83797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096614.83819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096614.83843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096614.83864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096614.83927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.25561: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": 
false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 32980 1727096615.29734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 32980 1727096615.29763: stderr chunk (state=3): >>><<< 32980 1727096615.29767: stdout chunk (state=3): >>><<< 32980 1727096615.29787: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
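
The JSON result above is produced by the package action resolving to ansible.legacy.dnf (the cached AnsiballZ module is ansible.modules.dnf). For reference, the recorded module_args (name: ["iproute"], state: "present", use_backend: "auto") correspond to the explicit dnf task sketched below; this is a hedged illustration of the equivalent call, not the playbook's actual wording:

    # Hypothetical explicit-dnf equivalent of the package call shown above
    - name: Install iproute via dnf
      ansible.builtin.dnf:
        name: iproute
        state: present
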
32980 1727096615.29821: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096615.29827: _low_level_execute_command(): starting 32980 1727096615.29832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096614.7257211-34199-152585278648761/ > /dev/null 2>&1 && sleep 0' 32980 1727096615.30272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.30278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096615.30309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.30312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.30314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096615.30316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.30377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.30382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.30384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.30416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.32238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.32269: stderr chunk (state=3): >>><<< 32980 1727096615.32272: stdout chunk (state=3): >>><<< 32980 1727096615.32289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096615.32294: handler run complete 32980 1727096615.32411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32980 1727096615.32554: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32980 1727096615.32590: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32980 1727096615.32613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32980 1727096615.32636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32980 1727096615.32697: variable '__install_status' from source: set_fact 32980 1727096615.32712: Evaluated conditional (__install_status is success): True 32980 1727096615.32723: attempt loop complete, returning result 32980 1727096615.32726: _execute() done 32980 1727096615.32728: dumping result to json 32980 1727096615.32733: done dumping result, returning 32980 1727096615.32740: done running TaskExecutor() for managed_node2/TASK: Install iproute [0afff68d-5257-457d-ef33-00000000093f] 32980 1727096615.32742: sending task result for task 0afff68d-5257-457d-ef33-00000000093f 32980 1727096615.32837: done sending task result for task 0afff68d-5257-457d-ef33-00000000093f 32980 1727096615.32839: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 32980 1727096615.32921: no more pending results, returning what we have 32980 1727096615.32924: results queue empty 32980 1727096615.32925: checking for any_errors_fatal 32980 1727096615.32930: done checking for any_errors_fatal 32980 1727096615.32931: checking for max_fail_percentage 32980 1727096615.32933: done checking for max_fail_percentage 32980 1727096615.32933: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.32934: done checking to see if all hosts have failed 32980 1727096615.32935: getting the remaining hosts for this loop 32980 1727096615.32936: done getting the remaining hosts for this loop 32980 1727096615.32940: getting the next task for host managed_node2 32980 1727096615.32947: done getting next task for host managed_node2 32980 1727096615.32952: ^ task is: TASK: Create veth interface {{ interface }} 32980 1727096615.32955: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32980 1727096615.32958: getting variables 32980 1727096615.32959: in VariableManager get_vars() 32980 1727096615.33002: Calling all_inventory to load vars for managed_node2 32980 1727096615.33005: Calling groups_inventory to load vars for managed_node2 32980 1727096615.33007: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.33017: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.33019: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.33022: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.33824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.34699: done with get_vars() 32980 1727096615.34716: done getting variables 32980 1727096615.34760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.34855: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:03:35 -0400 (0:00:00.691) 0:00:27.275 ****** 32980 1727096615.34881: entering _queue_task() for managed_node2/command 32980 1727096615.35130: worker is 1 (out of 1 available) 32980 1727096615.35142: exiting _queue_task() for managed_node2/command 32980 1727096615.35155: done queuing things up, now waiting for results queue to drain 32980 1727096615.35156: waiting for pending results... 
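
The TASK [Create veth interface lsr101] banner above refers to manage_test_interface.yml:27. From the templated task name, the items lookup, the three skipped loop items, and the false_condition reported for each of them further down, the task plausibly looks like the sketch below (a reconstruction from this log, not a verbatim copy of the playbook):

    # Hypothetical sketch of the task at manage_test_interface.yml:27
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
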
32980 1727096615.35338: running TaskExecutor() for managed_node2/TASK: Create veth interface lsr101 32980 1727096615.35418: in run() - task 0afff68d-5257-457d-ef33-000000000940 32980 1727096615.35429: variable 'ansible_search_path' from source: unknown 32980 1727096615.35432: variable 'ansible_search_path' from source: unknown 32980 1727096615.35647: variable 'interface' from source: play vars 32980 1727096615.35713: variable 'interface' from source: play vars 32980 1727096615.35760: variable 'interface' from source: play vars 32980 1727096615.35878: Loaded config def from plugin (lookup/items) 32980 1727096615.35885: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 32980 1727096615.35904: variable 'omit' from source: magic vars 32980 1727096615.36005: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.36014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.36025: variable 'omit' from source: magic vars 32980 1727096615.36197: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.36203: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.36337: variable 'type' from source: play vars 32980 1727096615.36340: variable 'state' from source: include params 32980 1727096615.36343: variable 'interface' from source: play vars 32980 1727096615.36347: variable 'current_interfaces' from source: set_fact 32980 1727096615.36355: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32980 1727096615.36359: when evaluation is False, skipping this task 32980 1727096615.36388: variable 'item' from source: unknown 32980 1727096615.36435: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 32980 1727096615.36581: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.36584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.36586: variable 'omit' from source: magic vars 32980 1727096615.36644: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.36647: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.36763: variable 'type' from source: play vars 32980 1727096615.36769: variable 'state' from source: include params 32980 1727096615.36772: variable 'interface' from source: play vars 32980 1727096615.36778: variable 'current_interfaces' from source: set_fact 32980 1727096615.36784: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32980 1727096615.36786: when evaluation is False, skipping this task 32980 1727096615.36807: variable 'item' from source: unknown 32980 1727096615.36849: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 32980 1727096615.36925: variable 
'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.36928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.36931: variable 'omit' from source: magic vars 32980 1727096615.37028: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.37032: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.37147: variable 'type' from source: play vars 32980 1727096615.37152: variable 'state' from source: include params 32980 1727096615.37155: variable 'interface' from source: play vars 32980 1727096615.37157: variable 'current_interfaces' from source: set_fact 32980 1727096615.37169: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32980 1727096615.37172: when evaluation is False, skipping this task 32980 1727096615.37189: variable 'item' from source: unknown 32980 1727096615.37231: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 32980 1727096615.37300: dumping result to json 32980 1727096615.37302: done dumping result, returning 32980 1727096615.37304: done running TaskExecutor() for managed_node2/TASK: Create veth interface lsr101 [0afff68d-5257-457d-ef33-000000000940] 32980 1727096615.37306: sending task result for task 0afff68d-5257-457d-ef33-000000000940 32980 1727096615.37338: done sending task result for task 0afff68d-5257-457d-ef33-000000000940 32980 1727096615.37340: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 32980 1727096615.37375: no more pending results, returning what we have 32980 1727096615.37377: results queue empty 32980 1727096615.37378: checking for any_errors_fatal 32980 1727096615.37387: done checking for any_errors_fatal 32980 1727096615.37388: checking for max_fail_percentage 32980 1727096615.37389: done checking for max_fail_percentage 32980 1727096615.37390: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.37391: done checking to see if all hosts have failed 32980 1727096615.37392: getting the remaining hosts for this loop 32980 1727096615.37393: done getting the remaining hosts for this loop 32980 1727096615.37396: getting the next task for host managed_node2 32980 1727096615.37404: done getting next task for host managed_node2 32980 1727096615.37406: ^ task is: TASK: Set up veth as managed by NetworkManager 32980 1727096615.37408: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.37412: getting variables 32980 1727096615.37413: in VariableManager get_vars() 32980 1727096615.37452: Calling all_inventory to load vars for managed_node2 32980 1727096615.37454: Calling groups_inventory to load vars for managed_node2 32980 1727096615.37457: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.37466: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.37470: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.37473: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.38388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.39235: done with get_vars() 32980 1727096615.39250: done getting variables 32980 1727096615.39293: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:03:35 -0400 (0:00:00.044) 0:00:27.319 ****** 32980 1727096615.39318: entering _queue_task() for managed_node2/command 32980 1727096615.39532: worker is 1 (out of 1 available) 32980 1727096615.39545: exiting _queue_task() for managed_node2/command 32980 1727096615.39557: done queuing things up, now waiting for results queue to drain 32980 1727096615.39559: waiting for pending results... 
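
The TASK [Set up veth as managed by NetworkManager] banner above refers to manage_test_interface.yml:35. That task is skipped in this run, so its command string never appears in the log; apart from the task name, the command action, and the when expression, the sketch below is an assumption:

    # Hypothetical sketch of the task at manage_test_interface.yml:35
    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set {{ interface }} managed true   # command assumed; not shown in this log
      when: type == 'veth' and state == 'present'
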
32980 1727096615.39731: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 32980 1727096615.39804: in run() - task 0afff68d-5257-457d-ef33-000000000941 32980 1727096615.39814: variable 'ansible_search_path' from source: unknown 32980 1727096615.39817: variable 'ansible_search_path' from source: unknown 32980 1727096615.39845: calling self._execute() 32980 1727096615.39925: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.39931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.39941: variable 'omit' from source: magic vars 32980 1727096615.40214: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.40228: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.40330: variable 'type' from source: play vars 32980 1727096615.40333: variable 'state' from source: include params 32980 1727096615.40337: Evaluated conditional (type == 'veth' and state == 'present'): False 32980 1727096615.40347: when evaluation is False, skipping this task 32980 1727096615.40350: _execute() done 32980 1727096615.40353: dumping result to json 32980 1727096615.40355: done dumping result, returning 32980 1727096615.40358: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-457d-ef33-000000000941] 32980 1727096615.40360: sending task result for task 0afff68d-5257-457d-ef33-000000000941 32980 1727096615.40442: done sending task result for task 0afff68d-5257-457d-ef33-000000000941 32980 1727096615.40444: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 32980 1727096615.40497: no more pending results, returning what we have 32980 1727096615.40500: results queue empty 32980 1727096615.40501: checking for any_errors_fatal 32980 1727096615.40510: done checking for any_errors_fatal 32980 1727096615.40511: checking for max_fail_percentage 32980 1727096615.40512: done checking for max_fail_percentage 32980 1727096615.40513: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.40514: done checking to see if all hosts have failed 32980 1727096615.40515: getting the remaining hosts for this loop 32980 1727096615.40516: done getting the remaining hosts for this loop 32980 1727096615.40519: getting the next task for host managed_node2 32980 1727096615.40526: done getting next task for host managed_node2 32980 1727096615.40528: ^ task is: TASK: Delete veth interface {{ interface }} 32980 1727096615.40530: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.40533: getting variables 32980 1727096615.40535: in VariableManager get_vars() 32980 1727096615.40571: Calling all_inventory to load vars for managed_node2 32980 1727096615.40574: Calling groups_inventory to load vars for managed_node2 32980 1727096615.40576: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.40586: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.40588: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.40590: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.41327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.42197: done with get_vars() 32980 1727096615.42210: done getting variables 32980 1727096615.42251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.42331: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:03:35 -0400 (0:00:00.030) 0:00:27.350 ****** 32980 1727096615.42351: entering _queue_task() for managed_node2/command 32980 1727096615.42560: worker is 1 (out of 1 available) 32980 1727096615.42575: exiting _queue_task() for managed_node2/command 32980 1727096615.42587: done queuing things up, now waiting for results queue to drain 32980 1727096615.42588: waiting for pending results... 
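
The TASK [Delete veth interface lsr101] banner above refers to manage_test_interface.yml:43; its when expression evaluates to True further down, but the exact command string is not visible in this excerpt, so the sketch below assumes a plain ip link deletion:

    # Hypothetical sketch of the task at manage_test_interface.yml:43
    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }} type veth   # assumed; the command itself is not shown in this excerpt
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
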
32980 1727096615.42756: running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr101 32980 1727096615.42856: in run() - task 0afff68d-5257-457d-ef33-000000000942 32980 1727096615.42861: variable 'ansible_search_path' from source: unknown 32980 1727096615.42863: variable 'ansible_search_path' from source: unknown 32980 1727096615.42891: calling self._execute() 32980 1727096615.42972: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.42976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.42986: variable 'omit' from source: magic vars 32980 1727096615.43255: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.43259: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.43391: variable 'type' from source: play vars 32980 1727096615.43395: variable 'state' from source: include params 32980 1727096615.43399: variable 'interface' from source: play vars 32980 1727096615.43402: variable 'current_interfaces' from source: set_fact 32980 1727096615.43411: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 32980 1727096615.43415: variable 'omit' from source: magic vars 32980 1727096615.43440: variable 'omit' from source: magic vars 32980 1727096615.43511: variable 'interface' from source: play vars 32980 1727096615.43526: variable 'omit' from source: magic vars 32980 1727096615.43559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096615.43592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096615.43608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096615.43621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096615.43630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096615.43653: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096615.43656: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.43658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.43733: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096615.43736: Set connection var ansible_timeout to 10 32980 1727096615.43739: Set connection var ansible_shell_type to sh 32980 1727096615.43741: Set connection var ansible_connection to ssh 32980 1727096615.43747: Set connection var ansible_shell_executable to /bin/sh 32980 1727096615.43752: Set connection var ansible_pipelining to False 32980 1727096615.43769: variable 'ansible_shell_executable' from source: unknown 32980 1727096615.43772: variable 'ansible_connection' from source: unknown 32980 1727096615.43775: variable 'ansible_module_compression' from source: unknown 32980 1727096615.43777: variable 'ansible_shell_type' from source: unknown 32980 1727096615.43780: variable 'ansible_shell_executable' from source: unknown 32980 1727096615.43786: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.43788: variable 'ansible_pipelining' from source: unknown 32980 1727096615.43790: variable 'ansible_timeout' from source: unknown 32980 1727096615.43794: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.43896: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096615.43911: variable 'omit' from source: magic vars 32980 1727096615.43914: starting attempt loop 32980 1727096615.43917: running the handler 32980 1727096615.43927: _low_level_execute_command(): starting 32980 1727096615.43934: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096615.44443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.44449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.44453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096615.44455: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.44510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.44513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.44520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.44575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.46240: stdout chunk (state=3): >>>/root <<< 32980 1727096615.46339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.46364: stderr chunk (state=3): >>><<< 32980 1727096615.46374: stdout chunk (state=3): >>><<< 32980 1727096615.46394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096615.46404: _low_level_execute_command(): starting 32980 1727096615.46410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491 `" && echo ansible-tmp-1727096615.4639297-34227-122929977544491="` echo /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491 `" ) && sleep 0' 32980 1727096615.46843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.46855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.46858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.46860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096615.46862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.46900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.46903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.46942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.48838: stdout chunk (state=3): >>>ansible-tmp-1727096615.4639297-34227-122929977544491=/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491 <<< 32980 1727096615.48999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.49003: stdout chunk (state=3): >>><<< 32980 1727096615.49005: stderr chunk (state=3): >>><<< 32980 1727096615.49172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096615.4639297-34227-122929977544491=/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096615.49176: variable 'ansible_module_compression' from source: unknown 32980 1727096615.49178: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096615.49180: variable 'ansible_facts' from source: unknown 32980 1727096615.49253: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py 32980 1727096615.49388: Sending initial data 32980 1727096615.49410: Sent initial data (156 bytes) 32980 1727096615.49995: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096615.50008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.50020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.50035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096615.50049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096615.50151: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.50183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.50252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.51852: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096615.51909: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 32980 1727096615.51957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpriwvzfhh" to remote "/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py" <<< 32980 1727096615.51961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpriwvzfhh /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py <<< 32980 1727096615.52751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.52754: stdout chunk (state=3): >>><<< 32980 1727096615.52757: stderr chunk (state=3): >>><<< 32980 1727096615.52759: done transferring module to remote 32980 1727096615.52761: _low_level_execute_command(): starting 32980 1727096615.52763: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/ /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py && sleep 0' 32980 1727096615.53379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096615.53394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.53437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.53456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.53545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.53562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.53583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.53639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.55461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.55487: stdout chunk (state=3): >>><<< 32980 1727096615.55490: stderr chunk (state=3): >>><<< 32980 1727096615.55505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096615.55594: _low_level_execute_command(): starting 32980 1727096615.55598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/AnsiballZ_command.py && sleep 0' 32980 1727096615.56117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096615.56130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096615.56143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096615.56158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096615.56183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 32980 1727096615.56279: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.56299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.56313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.56378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.72540: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-23 09:03:35.713550", "end": "2024-09-23 09:03:35.724277", "delta": "0:00:00.010727", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096615.74328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096615.74357: stdout chunk (state=3): >>><<< 32980 1727096615.74376: stderr chunk (state=3): >>><<< 32980 1727096615.74600: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-23 09:03:35.713550", "end": "2024-09-23 09:03:35.724277", "delta": "0:00:00.010727", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
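For reference, the JSON blob dumped just above is the raw wire format that AnsiballZ_command.py writes to stdout on the remote host; the action plugin reads it back and treats rc=0 as success. Below is a minimal sketch of parsing that blob with plain json (not ansible-core's actual result handling); the string is copied verbatim from the stdout chunk above.

import json

# The module result exactly as it appears in the stdout chunk above.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "del", "lsr101", "type", "veth"], '
    '"start": "2024-09-23 09:03:35.713550", "end": "2024-09-23 09:03:35.724277", '
    '"delta": "0:00:00.010727", "msg": "", '
    '"invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", '
    '"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, '
    '"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, '
    '"creates": null, "removes": null, "stdin": null}}}'
)

result = json.loads(raw)
print(result["rc"] == 0)   # True -> the callback reports the task as ok
print(result["cmd"])       # ['ip', 'link', 'del', 'lsr101', 'type', 'veth']
print(result["delta"])     # 0:00:00.010727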
32980 1727096615.74604: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096615.74607: _low_level_execute_command(): starting 32980 1727096615.74609: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096615.4639297-34227-122929977544491/ > /dev/null 2>&1 && sleep 0' 32980 1727096615.75481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096615.75558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096615.75583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096615.75598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096615.75657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096615.77721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096615.77775: stderr chunk (state=3): >>><<< 32980 1727096615.77785: stdout chunk (state=3): >>><<< 32980 1727096615.77814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096615.77825: handler run complete 32980 1727096615.77853: Evaluated conditional (False): False 32980 1727096615.77883: attempt loop complete, returning result 32980 1727096615.77890: _execute() done 32980 1727096615.77892: dumping result to json 32980 1727096615.77973: done dumping result, returning 32980 1727096615.77976: done running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr101 [0afff68d-5257-457d-ef33-000000000942] 32980 1727096615.77978: sending task result for task 0afff68d-5257-457d-ef33-000000000942 32980 1727096615.78228: done sending task result for task 0afff68d-5257-457d-ef33-000000000942 32980 1727096615.78231: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.010727", "end": "2024-09-23 09:03:35.724277", "rc": 0, "start": "2024-09-23 09:03:35.713550" } 32980 1727096615.78327: no more pending results, returning what we have 32980 1727096615.78330: results queue empty 32980 1727096615.78331: checking for any_errors_fatal 32980 1727096615.78337: done checking for any_errors_fatal 32980 1727096615.78338: checking for max_fail_percentage 32980 1727096615.78340: done checking for max_fail_percentage 32980 1727096615.78341: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.78341: done checking to see if all hosts have failed 32980 1727096615.78342: getting the remaining hosts for this loop 32980 1727096615.78344: done getting the remaining hosts for this loop 32980 1727096615.78347: getting the next task for host managed_node2 32980 1727096615.78357: done getting next task for host managed_node2 32980 1727096615.78361: ^ task is: TASK: Create dummy interface {{ interface }} 32980 1727096615.78365: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.78597: getting variables 32980 1727096615.78600: in VariableManager get_vars() 32980 1727096615.78641: Calling all_inventory to load vars for managed_node2 32980 1727096615.78644: Calling groups_inventory to load vars for managed_node2 32980 1727096615.78646: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.78657: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.78659: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.78662: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.80740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.82371: done with get_vars() 32980 1727096615.82396: done getting variables 32980 1727096615.82456: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.82575: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:03:35 -0400 (0:00:00.402) 0:00:27.752 ****** 32980 1727096615.82608: entering _queue_task() for managed_node2/command 32980 1727096615.82976: worker is 1 (out of 1 available) 32980 1727096615.82988: exiting _queue_task() for managed_node2/command 32980 1727096615.82999: done queuing things up, now waiting for results queue to drain 32980 1727096615.83000: waiting for pending results... 
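The task is queued with its name still templated (TASK: Create dummy interface {{ interface }} a few lines up) and the banner above is printed only after 'interface' is resolved from play vars to lsr101. A minimal sketch of that rendering step using plain Jinja2 rather than Ansible's Templar wrapper; the variable value is taken from the rendered banner.

from jinja2 import Template

play_vars = {"interface": "lsr101"}                   # value as shown in the banner
raw_name = "Create dummy interface {{ interface }}"   # name as queued
print(Template(raw_name).render(**play_vars))         # -> Create dummy interface lsr101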
32980 1727096615.83487: running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr101 32980 1727096615.83492: in run() - task 0afff68d-5257-457d-ef33-000000000943 32980 1727096615.83495: variable 'ansible_search_path' from source: unknown 32980 1727096615.83497: variable 'ansible_search_path' from source: unknown 32980 1727096615.83500: calling self._execute() 32980 1727096615.83595: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.83606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.83622: variable 'omit' from source: magic vars 32980 1727096615.84004: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.84026: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.84239: variable 'type' from source: play vars 32980 1727096615.84249: variable 'state' from source: include params 32980 1727096615.84258: variable 'interface' from source: play vars 32980 1727096615.84263: variable 'current_interfaces' from source: set_fact 32980 1727096615.84279: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 32980 1727096615.84286: when evaluation is False, skipping this task 32980 1727096615.84292: _execute() done 32980 1727096615.84297: dumping result to json 32980 1727096615.84302: done dumping result, returning 32980 1727096615.84309: done running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr101 [0afff68d-5257-457d-ef33-000000000943] 32980 1727096615.84316: sending task result for task 0afff68d-5257-457d-ef33-000000000943 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096615.84505: no more pending results, returning what we have 32980 1727096615.84509: results queue empty 32980 1727096615.84510: checking for any_errors_fatal 32980 1727096615.84523: done checking for any_errors_fatal 32980 1727096615.84524: checking for max_fail_percentage 32980 1727096615.84526: done checking for max_fail_percentage 32980 1727096615.84527: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.84528: done checking to see if all hosts have failed 32980 1727096615.84529: getting the remaining hosts for this loop 32980 1727096615.84530: done getting the remaining hosts for this loop 32980 1727096615.84534: getting the next task for host managed_node2 32980 1727096615.84545: done getting next task for host managed_node2 32980 1727096615.84548: ^ task is: TASK: Delete dummy interface {{ interface }} 32980 1727096615.84552: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.84556: getting variables 32980 1727096615.84558: in VariableManager get_vars() 32980 1727096615.84609: Calling all_inventory to load vars for managed_node2 32980 1727096615.84612: Calling groups_inventory to load vars for managed_node2 32980 1727096615.84615: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.84629: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.84632: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.84635: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.85323: done sending task result for task 0afff68d-5257-457d-ef33-000000000943 32980 1727096615.85326: WORKER PROCESS EXITING 32980 1727096615.86271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.87798: done with get_vars() 32980 1727096615.87825: done getting variables 32980 1727096615.87894: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.88010: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:03:35 -0400 (0:00:00.054) 0:00:27.807 ****** 32980 1727096615.88041: entering _queue_task() for managed_node2/command 32980 1727096615.88405: worker is 1 (out of 1 available) 32980 1727096615.88418: exiting _queue_task() for managed_node2/command 32980 1727096615.88430: done queuing things up, now waiting for results queue to drain 32980 1727096615.88431: waiting for pending results... 
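Each of the create/delete dummy and tap tasks above and below is skipped because its when: expression evaluates to False ("when evaluation is False, skipping this task"). A rough sketch of that check follows; the expressions are copied from the "Evaluated conditional" lines, but the variable values are assumptions, since the log never prints them (the veth delete that already ran suggests type is neither 'dummy' nor 'tap' here).

# Expressions copied from the "Evaluated conditional" lines; values are assumed.
def would_run(type, state, interface, current_interfaces):
    return {
        "Create dummy": type == "dummy" and state == "present" and interface not in current_interfaces,
        "Delete dummy": type == "dummy" and state == "absent" and interface in current_interfaces,
        "Create tap":   type == "tap" and state == "present" and interface not in current_interfaces,
        "Delete tap":   type == "tap" and state == "absent" and interface in current_interfaces,
    }

# Assumed values: interface comes from play vars, the rest is hypothetical.
for task, ok in would_run("veth", "absent", "lsr101", ["lo", "eth0"]).items():
    # mirrors: skipping: [managed_node2] => {..., "skip_reason": "Conditional result was False"}
    print(task, "runs" if ok else "skipped (Conditional result was False)")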
32980 1727096615.88724: running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr101 32980 1727096615.88844: in run() - task 0afff68d-5257-457d-ef33-000000000944 32980 1727096615.88862: variable 'ansible_search_path' from source: unknown 32980 1727096615.88872: variable 'ansible_search_path' from source: unknown 32980 1727096615.88918: calling self._execute() 32980 1727096615.89031: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.89042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.89058: variable 'omit' from source: magic vars 32980 1727096615.89435: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.89453: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.89665: variable 'type' from source: play vars 32980 1727096615.89682: variable 'state' from source: include params 32980 1727096615.89692: variable 'interface' from source: play vars 32980 1727096615.89699: variable 'current_interfaces' from source: set_fact 32980 1727096615.89712: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 32980 1727096615.89718: when evaluation is False, skipping this task 32980 1727096615.89724: _execute() done 32980 1727096615.89730: dumping result to json 32980 1727096615.89736: done dumping result, returning 32980 1727096615.89744: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr101 [0afff68d-5257-457d-ef33-000000000944] 32980 1727096615.89752: sending task result for task 0afff68d-5257-457d-ef33-000000000944 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096615.90018: no more pending results, returning what we have 32980 1727096615.90022: results queue empty 32980 1727096615.90023: checking for any_errors_fatal 32980 1727096615.90030: done checking for any_errors_fatal 32980 1727096615.90030: checking for max_fail_percentage 32980 1727096615.90032: done checking for max_fail_percentage 32980 1727096615.90033: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.90034: done checking to see if all hosts have failed 32980 1727096615.90035: getting the remaining hosts for this loop 32980 1727096615.90037: done getting the remaining hosts for this loop 32980 1727096615.90041: getting the next task for host managed_node2 32980 1727096615.90050: done getting next task for host managed_node2 32980 1727096615.90053: ^ task is: TASK: Create tap interface {{ interface }} 32980 1727096615.90057: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.90060: getting variables 32980 1727096615.90062: in VariableManager get_vars() 32980 1727096615.90113: Calling all_inventory to load vars for managed_node2 32980 1727096615.90116: Calling groups_inventory to load vars for managed_node2 32980 1727096615.90119: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.90133: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.90136: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.90139: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.90931: done sending task result for task 0afff68d-5257-457d-ef33-000000000944 32980 1727096615.90934: WORKER PROCESS EXITING 32980 1727096615.91834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.93351: done with get_vars() 32980 1727096615.93377: done getting variables 32980 1727096615.93433: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.93550: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:03:35 -0400 (0:00:00.055) 0:00:27.862 ****** 32980 1727096615.93588: entering _queue_task() for managed_node2/command 32980 1727096615.93931: worker is 1 (out of 1 available) 32980 1727096615.93943: exiting _queue_task() for managed_node2/command 32980 1727096615.93955: done queuing things up, now waiting for results queue to drain 32980 1727096615.93956: waiting for pending results... 
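Each TASK banner in this log carries two durations, e.g. "(0:00:00.054) 0:00:27.862" in the banner just above: the first appears to be the time spent since the previous banner and the second the cumulative elapsed time of the run (the cumulative values in this section grow by roughly the next per-task value). A small illustrative sketch of that accounting; the starting offset and the durations passed in are only approximations of the banners here, not values read from ansible.

from datetime import timedelta

elapsed = timedelta(seconds=27.350)   # assumed prior elapsed time, not from the log
for ms in (402, 54, 55):              # roughly the per-task durations in the banners above
    step = timedelta(milliseconds=ms)
    elapsed += step
    print(f"({step}) {elapsed}")      # prints microsecond precision; the log shows milliseconds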
32980 1727096615.94236: running TaskExecutor() for managed_node2/TASK: Create tap interface lsr101 32980 1727096615.94347: in run() - task 0afff68d-5257-457d-ef33-000000000945 32980 1727096615.94365: variable 'ansible_search_path' from source: unknown 32980 1727096615.94377: variable 'ansible_search_path' from source: unknown 32980 1727096615.94419: calling self._execute() 32980 1727096615.94528: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096615.94540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096615.94555: variable 'omit' from source: magic vars 32980 1727096615.94929: variable 'ansible_distribution_major_version' from source: facts 32980 1727096615.94948: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096615.95162: variable 'type' from source: play vars 32980 1727096615.95177: variable 'state' from source: include params 32980 1727096615.95188: variable 'interface' from source: play vars 32980 1727096615.95196: variable 'current_interfaces' from source: set_fact 32980 1727096615.95209: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 32980 1727096615.95216: when evaluation is False, skipping this task 32980 1727096615.95223: _execute() done 32980 1727096615.95229: dumping result to json 32980 1727096615.95235: done dumping result, returning 32980 1727096615.95244: done running TaskExecutor() for managed_node2/TASK: Create tap interface lsr101 [0afff68d-5257-457d-ef33-000000000945] 32980 1727096615.95251: sending task result for task 0afff68d-5257-457d-ef33-000000000945 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096615.95414: no more pending results, returning what we have 32980 1727096615.95418: results queue empty 32980 1727096615.95419: checking for any_errors_fatal 32980 1727096615.95425: done checking for any_errors_fatal 32980 1727096615.95426: checking for max_fail_percentage 32980 1727096615.95428: done checking for max_fail_percentage 32980 1727096615.95429: checking to see if all hosts have failed and the running result is not ok 32980 1727096615.95430: done checking to see if all hosts have failed 32980 1727096615.95430: getting the remaining hosts for this loop 32980 1727096615.95432: done getting the remaining hosts for this loop 32980 1727096615.95436: getting the next task for host managed_node2 32980 1727096615.95444: done getting next task for host managed_node2 32980 1727096615.95447: ^ task is: TASK: Delete tap interface {{ interface }} 32980 1727096615.95452: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096615.95456: getting variables 32980 1727096615.95457: in VariableManager get_vars() 32980 1727096615.95504: Calling all_inventory to load vars for managed_node2 32980 1727096615.95507: Calling groups_inventory to load vars for managed_node2 32980 1727096615.95510: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096615.95523: Calling all_plugins_play to load vars for managed_node2 32980 1727096615.95527: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096615.95531: Calling groups_plugins_play to load vars for managed_node2 32980 1727096615.96181: done sending task result for task 0afff68d-5257-457d-ef33-000000000945 32980 1727096615.96185: WORKER PROCESS EXITING 32980 1727096615.97107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096615.98803: done with get_vars() 32980 1727096615.98823: done getting variables 32980 1727096615.98884: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32980 1727096615.98993: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:03:35 -0400 (0:00:00.054) 0:00:27.916 ****** 32980 1727096615.99022: entering _queue_task() for managed_node2/command 32980 1727096615.99354: worker is 1 (out of 1 available) 32980 1727096615.99366: exiting _queue_task() for managed_node2/command 32980 1727096615.99382: done queuing things up, now waiting for results queue to drain 32980 1727096615.99384: waiting for pending results... 
32980 1727096615.99658: running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr101 32980 1727096615.99961: in run() - task 0afff68d-5257-457d-ef33-000000000946 32980 1727096616.00182: variable 'ansible_search_path' from source: unknown 32980 1727096616.00186: variable 'ansible_search_path' from source: unknown 32980 1727096616.00188: calling self._execute() 32980 1727096616.00330: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.00342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.00358: variable 'omit' from source: magic vars 32980 1727096616.01059: variable 'ansible_distribution_major_version' from source: facts 32980 1727096616.01085: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096616.01302: variable 'type' from source: play vars 32980 1727096616.01311: variable 'state' from source: include params 32980 1727096616.01319: variable 'interface' from source: play vars 32980 1727096616.01325: variable 'current_interfaces' from source: set_fact 32980 1727096616.01391: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 32980 1727096616.01394: when evaluation is False, skipping this task 32980 1727096616.01397: _execute() done 32980 1727096616.01398: dumping result to json 32980 1727096616.01400: done dumping result, returning 32980 1727096616.01402: done running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr101 [0afff68d-5257-457d-ef33-000000000946] 32980 1727096616.01404: sending task result for task 0afff68d-5257-457d-ef33-000000000946 32980 1727096616.01462: done sending task result for task 0afff68d-5257-457d-ef33-000000000946 32980 1727096616.01465: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32980 1727096616.01546: no more pending results, returning what we have 32980 1727096616.01551: results queue empty 32980 1727096616.01552: checking for any_errors_fatal 32980 1727096616.01559: done checking for any_errors_fatal 32980 1727096616.01560: checking for max_fail_percentage 32980 1727096616.01562: done checking for max_fail_percentage 32980 1727096616.01563: checking to see if all hosts have failed and the running result is not ok 32980 1727096616.01564: done checking to see if all hosts have failed 32980 1727096616.01565: getting the remaining hosts for this loop 32980 1727096616.01566: done getting the remaining hosts for this loop 32980 1727096616.01575: getting the next task for host managed_node2 32980 1727096616.01588: done getting next task for host managed_node2 32980 1727096616.01592: ^ task is: TASK: Verify network state restored to default 32980 1727096616.01594: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096616.01598: getting variables 32980 1727096616.01600: in VariableManager get_vars() 32980 1727096616.01645: Calling all_inventory to load vars for managed_node2 32980 1727096616.01649: Calling groups_inventory to load vars for managed_node2 32980 1727096616.01651: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096616.01666: Calling all_plugins_play to load vars for managed_node2 32980 1727096616.01776: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096616.01781: Calling groups_plugins_play to load vars for managed_node2 32980 1727096616.03797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096616.05404: done with get_vars() 32980 1727096616.05424: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Monday 23 September 2024 09:03:36 -0400 (0:00:00.064) 0:00:27.981 ****** 32980 1727096616.05516: entering _queue_task() for managed_node2/include_tasks 32980 1727096616.05984: worker is 1 (out of 1 available) 32980 1727096616.05994: exiting _queue_task() for managed_node2/include_tasks 32980 1727096616.06004: done queuing things up, now waiting for results queue to drain 32980 1727096616.06006: waiting for pending results... 32980 1727096616.06133: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 32980 1727096616.06231: in run() - task 0afff68d-5257-457d-ef33-0000000000ab 32980 1727096616.06235: variable 'ansible_search_path' from source: unknown 32980 1727096616.06252: calling self._execute() 32980 1727096616.06351: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.06362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.06383: variable 'omit' from source: magic vars 32980 1727096616.06776: variable 'ansible_distribution_major_version' from source: facts 32980 1727096616.06780: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096616.06782: _execute() done 32980 1727096616.06790: dumping result to json 32980 1727096616.06797: done dumping result, returning 32980 1727096616.06883: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0afff68d-5257-457d-ef33-0000000000ab] 32980 1727096616.06886: sending task result for task 0afff68d-5257-457d-ef33-0000000000ab 32980 1727096616.06954: done sending task result for task 0afff68d-5257-457d-ef33-0000000000ab 32980 1727096616.06957: WORKER PROCESS EXITING 32980 1727096616.07012: no more pending results, returning what we have 32980 1727096616.07018: in VariableManager get_vars() 32980 1727096616.07071: Calling all_inventory to load vars for managed_node2 32980 1727096616.07077: Calling groups_inventory to load vars for managed_node2 32980 1727096616.07080: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096616.07094: Calling all_plugins_play to load vars for managed_node2 32980 1727096616.07096: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096616.07099: Calling groups_plugins_play to load vars for managed_node2 32980 1727096616.08680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096616.10257: done with get_vars() 32980 1727096616.10280: 
variable 'ansible_search_path' from source: unknown 32980 1727096616.10293: we have included files to process 32980 1727096616.10294: generating all_blocks data 32980 1727096616.10296: done generating all_blocks data 32980 1727096616.10302: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32980 1727096616.10303: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32980 1727096616.10305: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32980 1727096616.10697: done processing included file 32980 1727096616.10699: iterating over new_blocks loaded from include file 32980 1727096616.10701: in VariableManager get_vars() 32980 1727096616.10717: done with get_vars() 32980 1727096616.10718: filtering new block on tags 32980 1727096616.10735: done filtering new block on tags 32980 1727096616.10737: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 32980 1727096616.10741: extending task lists for all hosts with included blocks 32980 1727096616.13897: done extending task lists 32980 1727096616.13898: done processing included files 32980 1727096616.13899: results queue empty 32980 1727096616.13900: checking for any_errors_fatal 32980 1727096616.13903: done checking for any_errors_fatal 32980 1727096616.13904: checking for max_fail_percentage 32980 1727096616.13905: done checking for max_fail_percentage 32980 1727096616.13905: checking to see if all hosts have failed and the running result is not ok 32980 1727096616.13906: done checking to see if all hosts have failed 32980 1727096616.13907: getting the remaining hosts for this loop 32980 1727096616.13908: done getting the remaining hosts for this loop 32980 1727096616.13910: getting the next task for host managed_node2 32980 1727096616.13914: done getting next task for host managed_node2 32980 1727096616.13916: ^ task is: TASK: Check routes and DNS 32980 1727096616.13918: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32980 1727096616.13921: getting variables 32980 1727096616.13922: in VariableManager get_vars() 32980 1727096616.13935: Calling all_inventory to load vars for managed_node2 32980 1727096616.13937: Calling groups_inventory to load vars for managed_node2 32980 1727096616.13939: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096616.13944: Calling all_plugins_play to load vars for managed_node2 32980 1727096616.13946: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096616.13949: Calling groups_plugins_play to load vars for managed_node2 32980 1727096616.15865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096616.19087: done with get_vars() 32980 1727096616.19106: done getting variables 32980 1727096616.19153: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 09:03:36 -0400 (0:00:00.138) 0:00:28.120 ****** 32980 1727096616.19388: entering _queue_task() for managed_node2/shell 32980 1727096616.19898: worker is 1 (out of 1 available) 32980 1727096616.19911: exiting _queue_task() for managed_node2/shell 32980 1727096616.19924: done queuing things up, now waiting for results queue to drain 32980 1727096616.19926: waiting for pending results... 32980 1727096616.20475: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 32980 1727096616.20875: in run() - task 0afff68d-5257-457d-ef33-000000000b17 32980 1727096616.20879: variable 'ansible_search_path' from source: unknown 32980 1727096616.20882: variable 'ansible_search_path' from source: unknown 32980 1727096616.20885: calling self._execute() 32980 1727096616.20952: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.20963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.20991: variable 'omit' from source: magic vars 32980 1727096616.21794: variable 'ansible_distribution_major_version' from source: facts 32980 1727096616.21814: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096616.21827: variable 'omit' from source: magic vars 32980 1727096616.21866: variable 'omit' from source: magic vars 32980 1727096616.21965: variable 'omit' from source: magic vars 32980 1727096616.22083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096616.22363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096616.22366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096616.22370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096616.22375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096616.22378: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
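What follows for "Check routes and DNS" is the same per-task bootstrap already seen for the veth delete above: echo ~ to find the home directory, a private temp directory created under ~/.ansible/tmp, the AnsiballZ_command.py payload uploaded over SFTP, chmod u+x, execution with /usr/bin/python3.12, and finally rm -f -r of the directory. The sketch below only rebuilds the mkdir one-liner in the same shape as the one in the log; the directory name components (epoch time, what looks like a PID, a random suffix) are inferred from the names in the log, not taken from ansible-core.

import os
import random
import time

def remote_tmp_command(remote_tmp="/root/.ansible/tmp"):
    # The log's names look like ansible-tmp-<epoch>-<pid>-<random>, e.g.
    # ansible-tmp-1727096616.269102-34267-205089702690551 (inferred pattern).
    name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2 ** 48)}"
    tmpdir = f"{remote_tmp}/{name}"
    # Same shape as the executed command: create the parent with a strict umask,
    # create the per-task dir, then echo name=path so the controller learns it.
    return (
        f"/bin/sh -c '( umask 77 && mkdir -p \"` echo {remote_tmp} `\" "
        f"&& mkdir \"` echo {tmpdir} `\" "
        f"&& echo {name}=\"` echo {tmpdir} `\" ) && sleep 0'"
    )

print(remote_tmp_command())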
32980 1727096616.22381: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.22383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.22546: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096616.22616: Set connection var ansible_timeout to 10 32980 1727096616.22624: Set connection var ansible_shell_type to sh 32980 1727096616.22631: Set connection var ansible_connection to ssh 32980 1727096616.22644: Set connection var ansible_shell_executable to /bin/sh 32980 1727096616.22683: Set connection var ansible_pipelining to False 32980 1727096616.22708: variable 'ansible_shell_executable' from source: unknown 32980 1727096616.22977: variable 'ansible_connection' from source: unknown 32980 1727096616.22980: variable 'ansible_module_compression' from source: unknown 32980 1727096616.22983: variable 'ansible_shell_type' from source: unknown 32980 1727096616.22985: variable 'ansible_shell_executable' from source: unknown 32980 1727096616.22987: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.22989: variable 'ansible_pipelining' from source: unknown 32980 1727096616.22991: variable 'ansible_timeout' from source: unknown 32980 1727096616.22993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.23062: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096616.23300: variable 'omit' from source: magic vars 32980 1727096616.23303: starting attempt loop 32980 1727096616.23306: running the handler 32980 1727096616.23310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096616.23312: _low_level_execute_command(): starting 32980 1727096616.23315: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096616.24563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.24591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.24640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096616.24652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.24833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096616.24845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.24956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.26695: stdout chunk (state=3): >>>/root <<< 32980 1727096616.26735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.26764: stderr chunk (state=3): >>><<< 32980 1727096616.26887: stdout chunk (state=3): >>><<< 32980 1727096616.27001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.27008: _low_level_execute_command(): starting 32980 1727096616.27011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551 `" && echo ansible-tmp-1727096616.269102-34267-205089702690551="` echo /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551 `" ) && sleep 0' 32980 1727096616.28110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.28330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.28333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.28382: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.30247: stdout chunk (state=3): >>>ansible-tmp-1727096616.269102-34267-205089702690551=/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551 <<< 32980 1727096616.30581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.30585: stdout chunk (state=3): >>><<< 32980 1727096616.30587: stderr chunk (state=3): >>><<< 32980 1727096616.30589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096616.269102-34267-205089702690551=/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.30591: variable 'ansible_module_compression' from source: unknown 32980 1727096616.30593: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096616.30595: variable 'ansible_facts' from source: unknown 32980 1727096616.30647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py 32980 1727096616.30855: Sending initial data 32980 1727096616.30858: Sent initial data (155 bytes) 32980 1727096616.31886: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.32034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.32137: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.32219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.33775: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32980 1727096616.33800: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096616.33820: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32980 1727096616.33835: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmp22sjwv1y /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py <<< 32980 1727096616.33846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py" <<< 32980 1727096616.33895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmp22sjwv1y" to remote "/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py" <<< 32980 1727096616.34641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.34652: stdout chunk (state=3): >>><<< 32980 1727096616.34663: stderr chunk (state=3): >>><<< 32980 1727096616.34760: done transferring module to remote 32980 1727096616.34788: _low_level_execute_command(): starting 32980 1727096616.34930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/ /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py && sleep 0' 32980 1727096616.35939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096616.36069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096616.36288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.36351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.38113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.38126: stdout chunk (state=3): >>><<< 32980 1727096616.38141: stderr chunk (state=3): >>><<< 32980 1727096616.38158: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.38238: _low_level_execute_command(): starting 32980 1727096616.38241: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/AnsiballZ_command.py && sleep 0' 32980 1727096616.39283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.39306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096616.39330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.39399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.55326: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 
qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2991sec preferred_lft 2991sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:03:36.543361", "end": "2024-09-23 09:03:36.552120", "delta": "0:00:00.008759", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 32980 1727096616.55337: stdout chunk (state=3): >>> <<< 32980 1727096616.56859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096616.56864: stdout chunk (state=3): >>><<< 32980 1727096616.56875: stderr chunk (state=3): >>><<< 32980 1727096616.56901: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2991sec preferred_lft 2991sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:03:36.543361", "end": "2024-09-23 09:03:36.552120", "delta": "0:00:00.008759", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
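The "Check routes and DNS" task above executed the following shell script on the managed node via ansible.legacy.command (_uses_shell=true). The script appears \n-escaped inside the JSON result; it is reproduced here unescaped, with brief comments added for readability (indentation approximate):

    # Diagnostic script from the _raw_params shown in the result above
    set -euo pipefail
    echo IP
    ip a            # interface addresses
    echo IP ROUTE
    ip route        # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route     # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf   # resolver configuration
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

The script returned rc=0 in 0:00:00.008759; its stdout (addresses, routes, and resolv.conf contents) is echoed in the task result below.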
32980 1727096616.56949: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096616.56958: _low_level_execute_command(): starting 32980 1727096616.56961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096616.269102-34267-205089702690551/ > /dev/null 2>&1 && sleep 0' 32980 1727096616.57444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.57448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.57451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 32980 1727096616.57453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096616.57455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.57507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.57515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.57546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.59399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.59403: stdout chunk (state=3): >>><<< 32980 1727096616.59407: stderr chunk (state=3): >>><<< 32980 1727096616.59427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.59572: handler run complete 32980 1727096616.59575: Evaluated conditional (False): False 32980 1727096616.59578: attempt loop complete, returning result 32980 1727096616.59580: _execute() done 32980 1727096616.59582: dumping result to json 32980 1727096616.59584: done dumping result, returning 32980 1727096616.59585: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0afff68d-5257-457d-ef33-000000000b17] 32980 1727096616.59587: sending task result for task 0afff68d-5257-457d-ef33-000000000b17 32980 1727096616.59665: done sending task result for task 0afff68d-5257-457d-ef33-000000000b17 32980 1727096616.59671: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008759", "end": "2024-09-23 09:03:36.552120", "rc": 0, "start": "2024-09-23 09:03:36.543361" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2991sec preferred_lft 2991sec inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 32980 1727096616.59749: no more pending results, returning what we have 32980 1727096616.59753: results queue empty 32980 1727096616.59754: checking for any_errors_fatal 32980 1727096616.59755: done checking for any_errors_fatal 32980 1727096616.59756: checking for max_fail_percentage 32980 1727096616.59758: done checking for max_fail_percentage 32980 1727096616.59759: checking to see if all hosts have failed and the running result is not ok 32980 1727096616.59760: done checking to see if all hosts have failed 32980 1727096616.59760: getting the remaining hosts for this loop 32980 1727096616.59762: done getting the remaining hosts for this loop 32980 1727096616.59765: getting the next task for host managed_node2 32980 
1727096616.59776: done getting next task for host managed_node2 32980 1727096616.59779: ^ task is: TASK: Verify DNS and network connectivity 32980 1727096616.59784: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096616.59788: getting variables 32980 1727096616.59790: in VariableManager get_vars() 32980 1727096616.59833: Calling all_inventory to load vars for managed_node2 32980 1727096616.59836: Calling groups_inventory to load vars for managed_node2 32980 1727096616.59838: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096616.59851: Calling all_plugins_play to load vars for managed_node2 32980 1727096616.59854: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096616.59857: Calling groups_plugins_play to load vars for managed_node2 32980 1727096616.61008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096616.61878: done with get_vars() 32980 1727096616.61894: done getting variables 32980 1727096616.61965: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 09:03:36 -0400 (0:00:00.426) 0:00:28.546 ****** 32980 1727096616.62005: entering _queue_task() for managed_node2/shell 32980 1727096616.62497: worker is 1 (out of 1 available) 32980 1727096616.62506: exiting _queue_task() for managed_node2/shell 32980 1727096616.62517: done queuing things up, now waiting for results queue to drain 32980 1727096616.62519: waiting for pending results... 
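The "Verify DNS and network connectivity" task queued here runs the shell script below; it appears \n-escaped in the module invocation further down in this log and is reproduced unescaped, with brief comments added for readability (indentation approximate):

    # Connectivity check from the _raw_params shown in the result further below
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # each mirror host must resolve
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # and must answer over HTTPS
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

Both hosts resolved and answered over HTTPS, so the script exited with rc=0 and the task result further below reports ok for managed_node2.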
32980 1727096616.62644: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 32980 1727096616.62730: in run() - task 0afff68d-5257-457d-ef33-000000000b18 32980 1727096616.62744: variable 'ansible_search_path' from source: unknown 32980 1727096616.62748: variable 'ansible_search_path' from source: unknown 32980 1727096616.62803: calling self._execute() 32980 1727096616.62910: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.62914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.62917: variable 'omit' from source: magic vars 32980 1727096616.63286: variable 'ansible_distribution_major_version' from source: facts 32980 1727096616.63306: Evaluated conditional (ansible_distribution_major_version != '6'): True 32980 1727096616.63457: variable 'ansible_facts' from source: unknown 32980 1727096616.64049: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 32980 1727096616.64053: variable 'omit' from source: magic vars 32980 1727096616.64085: variable 'omit' from source: magic vars 32980 1727096616.64109: variable 'omit' from source: magic vars 32980 1727096616.64138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32980 1727096616.64164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32980 1727096616.64185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32980 1727096616.64198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096616.64208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32980 1727096616.64231: variable 'inventory_hostname' from source: host vars for 'managed_node2' 32980 1727096616.64235: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.64237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.64310: Set connection var ansible_module_compression to ZIP_DEFLATED 32980 1727096616.64313: Set connection var ansible_timeout to 10 32980 1727096616.64316: Set connection var ansible_shell_type to sh 32980 1727096616.64318: Set connection var ansible_connection to ssh 32980 1727096616.64326: Set connection var ansible_shell_executable to /bin/sh 32980 1727096616.64331: Set connection var ansible_pipelining to False 32980 1727096616.64348: variable 'ansible_shell_executable' from source: unknown 32980 1727096616.64350: variable 'ansible_connection' from source: unknown 32980 1727096616.64353: variable 'ansible_module_compression' from source: unknown 32980 1727096616.64356: variable 'ansible_shell_type' from source: unknown 32980 1727096616.64358: variable 'ansible_shell_executable' from source: unknown 32980 1727096616.64360: variable 'ansible_host' from source: host vars for 'managed_node2' 32980 1727096616.64362: variable 'ansible_pipelining' from source: unknown 32980 1727096616.64366: variable 'ansible_timeout' from source: unknown 32980 1727096616.64370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 32980 1727096616.64472: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096616.64481: variable 'omit' from source: magic vars 32980 1727096616.64486: starting attempt loop 32980 1727096616.64488: running the handler 32980 1727096616.64502: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32980 1727096616.64515: _low_level_execute_command(): starting 32980 1727096616.64522: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32980 1727096616.64996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.65000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.65003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096616.65005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.65062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.65065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.65096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.66736: stdout chunk (state=3): >>>/root <<< 32980 1727096616.66834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.66858: stderr chunk (state=3): >>><<< 32980 1727096616.66861: stdout chunk (state=3): >>><<< 32980 1727096616.66882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.66892: _low_level_execute_command(): starting 32980 1727096616.66898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880 `" && echo ansible-tmp-1727096616.6688156-34290-197542368311880="` echo /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880 `" ) && sleep 0' 32980 1727096616.67299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.67310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096616.67314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.67355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.67358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.67399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.69275: stdout chunk (state=3): >>>ansible-tmp-1727096616.6688156-34290-197542368311880=/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880 <<< 32980 1727096616.69380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.69401: stderr chunk (state=3): >>><<< 32980 1727096616.69404: stdout chunk (state=3): >>><<< 32980 1727096616.69418: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096616.6688156-34290-197542368311880=/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.69445: variable 'ansible_module_compression' from source: unknown 32980 1727096616.69487: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32980as596vvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32980 1727096616.69517: variable 'ansible_facts' from source: unknown 32980 1727096616.69575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py 32980 1727096616.69675: Sending initial data 32980 1727096616.69679: Sent initial data (156 bytes) 32980 1727096616.70079: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.70083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.70102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.70148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.70151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.70194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.71754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32980 1727096616.71806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32980 1727096616.71852: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32980as596vvb/tmpgunxy_2n /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py <<< 32980 1727096616.71856: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py" <<< 32980 1727096616.71937: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32980as596vvb/tmpgunxy_2n" to remote "/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py" <<< 32980 1727096616.72687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.72711: stderr chunk (state=3): >>><<< 32980 1727096616.72790: stdout chunk (state=3): >>><<< 32980 1727096616.72801: done transferring module to remote 32980 1727096616.72818: _low_level_execute_command(): starting 32980 1727096616.72828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/ /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py && sleep 0' 32980 1727096616.73476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32980 1727096616.73491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096616.73562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.73629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096616.73658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.73723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096616.75489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096616.75510: stderr chunk (state=3): >>><<< 32980 1727096616.75513: stdout chunk (state=3): >>><<< 32980 1727096616.75529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096616.75532: _low_level_execute_command(): starting 32980 1727096616.75537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/AnsiballZ_command.py && sleep 0' 32980 1727096616.75932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32980 1727096616.75936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.75950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096616.75997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 32980 1727096616.76015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096616.76048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096617.01438: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6665 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7978 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:03:36.908635", "end": "2024-09-23 09:03:37.013206", "delta": "0:00:00.104571", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32980 1727096617.03020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 32980 1727096617.03050: stderr chunk (state=3): >>><<< 32980 1727096617.03054: stdout chunk (state=3): >>><<< 32980 1727096617.03078: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6665 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7978 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:03:36.908635", "end": "2024-09-23 09:03:37.013206", "delta": "0:00:00.104571", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 32980 1727096617.03112: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32980 1727096617.03119: _low_level_execute_command(): starting 32980 1727096617.03124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096616.6688156-34290-197542368311880/ > /dev/null 2>&1 && sleep 0' 32980 1727096617.03557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096617.03565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32980 1727096617.03598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096617.03602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32980 1727096617.03604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 32980 1727096617.03606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32980 1727096617.03655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 32980 1727096617.03659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32980 1727096617.03695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32980 1727096617.05491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32980 1727096617.05514: stderr chunk (state=3): >>><<< 32980 1727096617.05517: stdout chunk (state=3): >>><<< 32980 1727096617.05529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32980 1727096617.05535: handler run complete 32980 1727096617.05556: Evaluated conditional (False): False 32980 1727096617.05565: attempt loop complete, returning result 32980 1727096617.05569: _execute() done 32980 1727096617.05572: dumping result to json 32980 1727096617.05577: done dumping result, returning 32980 1727096617.05585: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0afff68d-5257-457d-ef33-000000000b18] 32980 1727096617.05589: sending task result for task 0afff68d-5257-457d-ef33-000000000b18 32980 1727096617.05690: done sending task result for task 0afff68d-5257-457d-ef33-000000000b18 32980 1727096617.05693: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.104571", "end": "2024-09-23 09:03:37.013206", "rc": 0, "start": "2024-09-23 09:03:36.908635" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6665 0 --:--:-- --:--:-- --:--:-- 6777 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 7978 0 --:--:-- --:--:-- --:--:-- 8083 32980 1727096617.05758: no more pending results, returning what we have 32980 1727096617.05761: results queue empty 32980 1727096617.05762: checking for 
any_errors_fatal 32980 1727096617.05778: done checking for any_errors_fatal 32980 1727096617.05779: checking for max_fail_percentage 32980 1727096617.05781: done checking for max_fail_percentage 32980 1727096617.05782: checking to see if all hosts have failed and the running result is not ok 32980 1727096617.05782: done checking to see if all hosts have failed 32980 1727096617.05783: getting the remaining hosts for this loop 32980 1727096617.05788: done getting the remaining hosts for this loop 32980 1727096617.05792: getting the next task for host managed_node2 32980 1727096617.05808: done getting next task for host managed_node2 32980 1727096617.05810: ^ task is: TASK: meta (flush_handlers) 32980 1727096617.05812: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32980 1727096617.05816: getting variables 32980 1727096617.05817: in VariableManager get_vars() 32980 1727096617.05857: Calling all_inventory to load vars for managed_node2 32980 1727096617.05859: Calling groups_inventory to load vars for managed_node2 32980 1727096617.05861: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096617.05875: Calling all_plugins_play to load vars for managed_node2 32980 1727096617.05877: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096617.05880: Calling groups_plugins_play to load vars for managed_node2 32980 1727096617.06826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096617.07683: done with get_vars() 32980 1727096617.07698: done getting variables 32980 1727096617.07746: in VariableManager get_vars() 32980 1727096617.07757: Calling all_inventory to load vars for managed_node2 32980 1727096617.07759: Calling groups_inventory to load vars for managed_node2 32980 1727096617.07760: Calling all_plugins_inventory to load vars for managed_node2 32980 1727096617.07763: Calling all_plugins_play to load vars for managed_node2 32980 1727096617.07765: Calling groups_plugins_inventory to load vars for managed_node2 32980 1727096617.07766: Calling groups_plugins_play to load vars for managed_node2 32980 1727096617.08385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32980 1727096617.09235: done with get_vars() 32980 1727096617.09251: done queuing things up, now waiting for results queue to drain 32980 1727096617.09253: results queue empty 32980 1727096617.09253: checking for any_errors_fatal 32980 1727096617.09255: done checking for any_errors_fatal 32980 1727096617.09256: checking for max_fail_percentage 32980 1727096617.09256: done checking for max_fail_percentage 32980 1727096617.09257: checking to see if all hosts have failed and the running result is not ok 32980 1727096617.09257: done checking to see if all hosts have failed 32980 1727096617.09258: getting the remaining hosts for this loop 32980 1727096617.09258: done getting the remaining hosts for this loop 32980 1727096617.09260: getting the next task for host managed_node2 32980 1727096617.09262: done getting next task for host managed_node2 32980 1727096617.09263: ^ task is: TASK: meta (flush_handlers) 32980 1727096617.09264: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
32980 1727096617.09266: getting variables
32980 1727096617.09267: in VariableManager get_vars()
32980 1727096617.09280: Calling all_inventory to load vars for managed_node2
32980 1727096617.09281: Calling groups_inventory to load vars for managed_node2
32980 1727096617.09282: Calling all_plugins_inventory to load vars for managed_node2
32980 1727096617.09285: Calling all_plugins_play to load vars for managed_node2
32980 1727096617.09287: Calling groups_plugins_inventory to load vars for managed_node2
32980 1727096617.09288: Calling groups_plugins_play to load vars for managed_node2
32980 1727096617.09948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
32980 1727096617.10780: done with get_vars()
32980 1727096617.10794: done getting variables
32980 1727096617.10823: in VariableManager get_vars()
32980 1727096617.10832: Calling all_inventory to load vars for managed_node2
32980 1727096617.10833: Calling groups_inventory to load vars for managed_node2
32980 1727096617.10834: Calling all_plugins_inventory to load vars for managed_node2
32980 1727096617.10837: Calling all_plugins_play to load vars for managed_node2
32980 1727096617.10838: Calling groups_plugins_inventory to load vars for managed_node2
32980 1727096617.10840: Calling groups_plugins_play to load vars for managed_node2
32980 1727096617.11454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
32980 1727096617.12302: done with get_vars()
32980 1727096617.12319: done queuing things up, now waiting for results queue to drain
32980 1727096617.12320: results queue empty
32980 1727096617.12321: checking for any_errors_fatal
32980 1727096617.12322: done checking for any_errors_fatal
32980 1727096617.12322: checking for max_fail_percentage
32980 1727096617.12323: done checking for max_fail_percentage
32980 1727096617.12323: checking to see if all hosts have failed and the running result is not ok
32980 1727096617.12324: done checking to see if all hosts have failed
32980 1727096617.12324: getting the remaining hosts for this loop
32980 1727096617.12325: done getting the remaining hosts for this loop
32980 1727096617.12326: getting the next task for host managed_node2
32980 1727096617.12328: done getting next task for host managed_node2
32980 1727096617.12329: ^ task is: None
32980 1727096617.12329: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
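The two "TASK: meta (flush_handlers)" entries above are not tasks written in the test playbook; they are the implicit handler-flush points that Ansible appends at its play section boundaries (after pre_tasks, after the main task list, and after post_tasks), which is why they appear in the strategy output even though no handlers remain to run here. For comparison, an explicit flush inside a play is a one-line meta task; the following is a generic, self-contained sketch with hypothetical file and service names, not something taken from this test run.

- hosts: all
  tasks:
    - name: Change a config file and notify a handler (hypothetical example)
      ansible.builtin.copy:
        content: "example=1\n"
        dest: /etc/example.conf
      notify: Restart example service

    - name: Run pending handlers now rather than at the end of the play
      ansible.builtin.meta: flush_handlers

  handlers:
    - name: Restart example service
      ansible.builtin.service:
        name: example
        state: restarted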
32980 1727096617.12330: done queuing things up, now waiting for results queue to drain
32980 1727096617.12331: results queue empty
32980 1727096617.12331: checking for any_errors_fatal
32980 1727096617.12332: done checking for any_errors_fatal
32980 1727096617.12332: checking for max_fail_percentage
32980 1727096617.12333: done checking for max_fail_percentage
32980 1727096617.12333: checking to see if all hosts have failed and the running result is not ok
32980 1727096617.12333: done checking to see if all hosts have failed
32980 1727096617.12335: getting the next task for host managed_node2
32980 1727096617.12336: done getting next task for host managed_node2
32980 1727096617.12337: ^ task is: None
32980 1727096617.12337: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=79   changed=2    unreachable=0    failed=0    skipped=67   rescued=0    ignored=0

Monday 23 September 2024  09:03:37 -0400 (0:00:00.503)       0:00:29.050 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.17s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.22s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Create veth interface lsr101 -------------------------------------------- 1.18s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.07s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.03s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.97s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
Install iproute --------------------------------------------------------- 0.87s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 0.81s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.73s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.69s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.67s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.61s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.60s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Gather current interface info ------------------------------------------- 0.57s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Verify DNS and network connectivity ------------------------------------- 0.50s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 0.50s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Get NM profile info ----------------------------------------------------- 0.49s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Set up veth as managed by NetworkManager -------------------------------- 0.47s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
32980 1727096617.12428: RUNNING CLEANUP
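The timing summary shows the run entering through tests_vlan_mtu_nm.yml and then gathering facts again in playbooks/tests_vlan_mtu.yml, which matches the usual linux-system-roles pattern of a thin provider wrapper that pins the network provider and imports the shared, provider-independent test playbook. The sketch below illustrates that conventional pattern only; it is an assumption about the layout, not the verbatim contents of tests_vlan_mtu_nm.yml, which are not shown in this log.

# Conventional provider wrapper (sketch, not the actual file)
- name: Run playbooks/tests_vlan_mtu.yml with nm as the provider
  hosts: all
  tasks:
    - name: Force the role to use NetworkManager for this run
      ansible.builtin.set_fact:
        network_provider: nm

- name: Import the provider-independent VLAN MTU test
  import_playbook: playbooks/tests_vlan_mtu.yml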